

Digest of The Economist, 2006 (8-9)
The mismeasure of woman

Men and women think differently. But not that differently

IN THE 1970s there was a fad for giving dolls to baby boys and fire-engines to baby girls. The idea was that differences in behaviour between the sexes were solely the result of upbringing: culture turned women into ironers, knitters and chatterboxes, and men into hammerers, drillers and silent types. Switching toys would put an end to sexual sorting. Today, it is clear why it did not. When boys and girls are born, they are already different, and they favour different toys from the beginning.

That boys and girls—and men and women—are programmed by evolution to behave differently from one another is now widely accepted. Surely, no one today would think of doing what John Money, of Johns Hopkins University, did in 1967: amputating the genitalia of a boy who had suffered a botched circumcision, and advising the parents to bring him up as a girl. The experiment didn't work, and the consequences were tragic.

But which of the differences between the sexes are “biological”, in the sense that they have been honed by evolution, and which are “cultural” or “environmental” and might more easily be altered by changed circumstances, is still fiercely debated. The sensitivity of the question was shown last year by a furore at Harvard University. Larry Summers, then Harvard's president, caused a storm when he suggested that innate ability could be an important reason why there were so few women in the top positions in mathematics, engineering and the physical sciences. Even as a proposition for discussion, this is unacceptable to some. But biological explanations of human behaviour are making a comeback as the generation of academics that feared them as a covert way of justifying eugenics, or of thwarting Marxist utopianism, is retiring. The success of neo-Darwinism has provided an intellectual underpinning for discussion about why some differences between the sexes might be innate.
And new scanning techniques have enabled researchers to examine the brain's interior while it is working, showing that male and female brains do, at one level, operate differently. The results, however, do not always support past clichés about what the differences in question actually are.

Differences in behaviour between the sexes must, in some way, be reflections of systematic differences between the brains of males and females. Such differences certainly exist, but drawing inferences from them is not as easy as it may appear. For a start, men's brains are about 9% larger than those of women. That used to be cited as evidence of men's supposedly greater intelligence. Actually, the difference is largely (and probably completely) explained by the fact that men are bigger than women.

In recent years, more detailed examination has refined the picture. Female brains have a higher percentage of grey matter (the manifestation, en bloc, of the central bodies of nerve cells), and thus a lower percentage of white matter (the manifestation of the long, thin filaments that connect nerve cells together), than male brains. That, plus the fact that in some regions of the female brain nerve cells are packed more densely than in men, means that the number of nerve cells in male and female brains may be similar. Oddly, though, the main connection between the two hemispheres of the brain, which is known as the corpus callosum and is made of white matter, is proportionately smaller in men than women. This may explain why men use only one side of the brain to process some problems for which women employ both sides.

These differences in structure and wiring do not appear to have any influence on intelligence as measured by IQ tests. It does, however, seem that the sexes carry out these tests in different ways.
In one example, where men and women perform equally well in a test that asks them to work out whether nonsense words rhyme, brain scanning shows that women use areas on both the right and the left sides of the brain to accomplish the task. Men, by contrast, use only areas on the left side. There is also a correlation between mathematical reasoning and temporal-lobe activity in men—but none in women. More generally, men seem to rely more on their grey matter for their IQ, whereas women rely more on their white matter.

American exceptionalism

The world's biggest insurance market is too splintered

KANSAS CITY, Missouri, is known more for its historical role as a cattle town than as a financial hub. But it is to this midwestern city, America's 26th largest, that regulators and insurance executives from around the globe head when they want to make sense of the world's largest—and one of its weirdest—insurance markets. For it is in Kansas City that the National Association of Insurance Commissioners (NAIC) is housed. It oversees a market
accounting for one-third of premiums written worldwide. Outside Kansas City, the market becomes a regulatory free-for-all. Each of America's 50 states, plus the District of Columbia, governs its insurance industry in its own way. In an increasingly global insurance market, America's state-based system is coming under strong pressure to reform.

Insurance has changed dramatically since the NAIC was set up in 1871, with growing sophistication in underwriting and risk management. Premiums in America have ballooned to $1.1 trillion and market power is increasingly concentrated in the hands of big players (some of them foreign-owned) that are pushing for an overhaul of the state-based system. “It's an extremely expensive and Byzantine process,” says Bob Hartwig, an economist with the Insurance Information Institute, a research group.

Though the issue is fiercely political, congressional support for simplifying the system is gaining ground. Both houses of Congress are looking at proposals to change the state-based system. Big insurers favour a version that would implement an optional federal charter allowing them to bypass the state-by-state regulatory process if they choose. A similar system already exists for banks. Proponents of the changes see more efficiency, an ability to roll out products more quickly nationally and, ultimately, better offerings for consumers as a result.

Yet some consumer groups favour state-based regulation. They believe it keeps premiums lower than they otherwise would be. Premiums as a percentage of gross output are lower in America than in several other countries. The political headwinds are strong: insurance commissioners are elected officials in some states (California, for instance) and appointed by the governor in others. The industry is also split: most of the country's 4,500 insurers are small, and many of them have close ties with state-based regulators, whose survival they support. But even these forces may eventually be overcome.
Elsewhere in the industry in America, there are other calls for reform. In a backdoor form of protectionism, American reinsurance firms have long benefited from a regulation that requires foreign reinsurers writing cross-border business into America to post more collateral than American firms do. “If you operate outside the borders of the US, they don't trust you one inch,” laments Julian James, head of international business at Lloyd's of London, which writes 38% of its business in America. The collateral requirement was established because of worries about regulatory standards abroad, and the financial strength of global reinsurers. Today regulatory standards have been tightened in many foreign markets. A majority of America's reinsurance cover now comes from firms based abroad, including many that have set up offshore in Bermuda (for tax reasons) primarily to serve America.

Too hot to handle

Dell's battery recall reveals the technology industry's vulnerabilities

THERE is the nail test, in which a team of engineers drives a large metal nail through a battery cell to see if it explodes. In another trial, laboratory technicians bake the batteries in an oven to simulate the effects of a digital device left in a closed car on a sweltering day—to check the reaction of the chemicals inside. On production runs, random batches of batteries are tested for temperature, efficiency, energy density and output.

But the rigorous processes that go into making sophisticated, rechargeable batteries—the heart of billions of electronic gadgets around the world—were not enough. On August 14th Dell, a computer company, said it would replace 4.1m lithium-ion batteries made by Sony, a consumer-electronics firm, in laptop computers sold between 2004 and last month. A handful of customers had reported the batteries overheating, catching fire and even exploding—including one celebrated case at a conference this year in Japan, which was captured on film and passed around the internet.
The cost to the two companies is expected to be between $200m and $400m. In some ways, Dell is a victim of its success. The company was a pioneer in turning the personal computer into a commodity, which meant squeezing suppliers to the last penny, using economies of scale by placing huge orders, and running efficient supply chains with little room for error. It all created a volatile environment in which mistakes can have grave effects.

Since lithium-ion batteries were introduced in 1991, their capacity to overheat and burst into flame has been well known. Indeed, in 2004 America banned them as cargo on passenger planes, as a fire hazard. But the latest problems seem to have arisen because of the manufacturing process, which demands perfection. “If there is even a nano-sized particle of dust, a small metal shard or water condensation that gets into the battery cell, it can overheat and explode,” says Sara Bradford of Frost & Sullivan, a consultancy. As the energy needs of devices have grown rapidly, so have the demands on batteries.

The computing industry's culture is also partly to blame. Firms have long tried to ship products as fast as they possibly can, and they may have set less store by quality. They used to mock the telecoms industry's ethos of “five-nines”—99.999% reliability—because it meant long product cycles. But now they are gradually accepting it as a benchmark. That is partly why Microsoft has taken so long to perfect its new operating system, Windows Vista. Compared with other product crises, from contaminated Coca-Cola in 1999 to Firestone's faulty tyres in 2000, Dell can be
complimented for quickly taking charge of a hot situation. The firm says there have been only six incidents of laptops overheating in America since December 2005—but the internet created a conflagration.

Keeping the faith

Mixing religion and development raises soul-searching questions

WORLD Bank projects are usually free of words like “faith” and “soul.” Most of its missions speak the jargon of development: poverty reduction, aggregate growth and structural adjustments. But a small unit within the bank has been building ties with religious groups, working to ease their suspicions and use their influence to further the bank's goals. In many developing countries, such groups have the best access to the people the bank is trying to help.

The programme has existed for eight years, but this brainchild of the bank's previous president, James Wolfensohn, has spent the past year largely in limbo as his successor, Paul Wolfowitz, decides its future. Now, some religious leaders in the developing world are worried that the progress they have made with the bank may stall.

That progress has not always been easy. The programme, named the Development Dialogue on Values and Ethics, faced controversy from the start. Just as religious groups have struggled to work with the bank, many people on the inside doubted whether the bank should be delving into the divine. Critics argued that religion could be divisive and political. Some said religion clashes with the secular goals of modernisation. Although the bank does not lend directly to religious groups, it works with them to provide health, educational and other benefits, and receives direct input from those on the ground in poor countries. Katherine Marshall, director of the bank's faith unit, argues that such groups are in an ideal position to educate people, move resources and keep an eye on corruption. They are organised distribution systems in otherwise chaotic places.
The programme has had success getting evangelical groups to fight malaria in Mozambique, improve microcredit and water distribution in India, and educate people about AIDS in Africa. “We started from very different viewpoints. The World Bank is looking at the survival of a country; we look at the survival of a patient,” says Leonardo Palombi, of the Community of Sant'Egidio, an Italian church group that works in Africa.

Although the work continues, those involved in Mr Wolfensohn's former pet project now fret over its future. Some expect the faith unit to be transferred to an independent organisation also set up by Mr Wolfensohn, the World Faiths Development Dialogue, which will still maintain a link with the bank. Religious groups are hoping their voices will still be heard. “If we are going to make progress, faith institutions need to be involved. We believe religion has the ability to bring stability. It will be important for the bank to follow through,” says Agnes Abuom, of the World Council of Churches for Africa, based in Kenya.

Like religious groups, large institutions such as the bank can resist change. Economists and development experts are sometimes slow to believe in new ideas. One positive by-product of the initiative is that religious groups once wary of the bank's intentions are less suspicious. Ultimately, as long as both economists and evangelists aim to help the poor attain a better life on earth, differences in opinion about the life hereafter do not matter much.

Stand and deliver

For the first time since the epidemic began, money to fight AIDS is in plentiful supply. It is now time to convert words into action

KEVIN DE COCK, the World Health Organisation's AIDS supremo, is not a man to mince his words. He reckons that he and his colleagues in the global AIDS establishment have between five and seven years to make a real dent in the problem.
If they fail, the world's attention span will be exhausted, charitable donors and governments will turn to other matters and AIDS will be relegated in the public consciousness to being yet another intractable problem of the poor world about which little or nothing can be done. For now, though, the money is flowing. About $8.9 billion is expected to be available this year. And, regardless of Dr De Cock's long-term worries, that sum should rise over the next few years. Not surprisingly, a lot of people are eager to spend it.

Many of those people—some 24,000 of them—have been meeting in Toronto at the 16th International AIDS Conference. An AIDS conference is unlike any other scientific meeting. In part, it is a jamboree in which people try to outdo each other in displays of cultural inclusiveness: the music of six continents resonates around the convention centre. In part, it is a lightning conductor that allows AIDS activists to make their discontent known to the world in a series of semi-official protests. It is also what other scientific meetings are, a forum for the presentation of papers with titles such as “Differing lymphocyte cytokine responses in HIV and Leishmania co-infection”. But mostly, it is a giant council of war. And at this one, the generals are trying to impose a complete change of military strategy.

When AIDS was discovered, there was no treatment. Existing anti-viral drugs were tried but at best they delayed the inevitable,
and at worst they failed completely. Prevention, then, was not merely better than cure, it was the only thing to talk about. Condoms were distributed. Posters were put up extolling the advantages of safe sex. Television adverts were run that showed the consequences of carelessness.

Ten years ago, though, a new class of drugs known as protease inhibitors was developed. In combination with some of the older drugs, they produced what is now known as highly active anti-retroviral therapy, or HAART. In most cases, HAART can prolong life indefinitely. That completely changed the picture. Once the AIDS activists had treated themselves, they began to lobby for the poor world to be treated, too. And, with much foot-dragging, that is now happening. About 1.6m people in low- and middle-income countries, 1m of them in sub-Saharan Africa, are now receiving anti-AIDS drugs routinely. The intention, announced at last year's G8 meeting in Scotland, is that the drugs should be available by 2010 to all who would benefit from them.

However, those on drugs remain infected and require treatment indefinitely. Stopping the epidemic requires a renewed emphasis on prevention, and that is what the conference organisers have been trying to achieve.

Man, deconstructed

The DNA that may have driven the evolution of the human brain

ONE of the benefits of knowing the complete genetic sequences of humans and other animals is that it becomes possible to compare these blueprints. You can then work out what separates man from beast—genetically speaking, at least. The human brain sets man apart. About 2m years ago it began to grow in size, and today it is about three times larger than that of chimpanzees, man's closest relative. Human intelligence and behavioural complexity have far outstripped those of man's simian cousins, so the human brain seems to have got more complex, as well as bigger. Yet no study has pinpointed the genetic changes that cause these differences between man and chimp.
Now a group of scientists believe they have located some interesting stretches of DNA that may have been crucial in the evolution of the human brain. A team led by David Haussler of the Howard Hughes Medical Institute in California compared the human genome with those of other mammals, including other primates. They reported the results in Nature.

The researchers looked at the non-human genomes first, seeking regions that had not changed much throughout evolutionary history. Regions that remain untouched by random mutation over long periods are typically important ones, and are thus conserved by evolution. Next the researchers found the equivalent regions in the human genome to see if any were very different between humans and chimps. Such a sudden change is a hallmark of a functional evolutionary shift. They found 49 regions they dubbed “human accelerated regions” (HARs) that have shown rapid, recent evolution.

Most of these regions are not genes as commonly understood, because they code for something other than the proteins that are expressed in human cells and that regulate biological processes. A number of the HARs are portions of DNA that are responsible for turning genes on and off.

Intriguingly, the most rapidly changing region was HAR1, which has accumulated 18 genetic changes when only one would be expected to occur by chance. It codes for a bit of RNA (a molecule that usually acts as a template for translating DNA into protein) that, it is speculated, has some direct function in neuronal development. HAR1 is expressed before birth in the developing neocortex—the outer layer of the brain that seems to be involved in higher functions such as language, conscious thought and sensory perception. HAR1 is expressed in cells that are thought to have a vital role in directing migrating nerve cells in the developing brain. This happens at seven to 19 weeks of gestation, a crucial time when many of the nerve cells are establishing their functions.
Without more research, the function of HAR1 remains mere speculation. But an intriguing facet of this work is that, until now, most researchers had focused their hunt for differences on the protein-coding stretches of the genome. That such a discovery has been made in what were regarded as the less interesting parts of the human genome presages where exciting genomic finds may lie in the future.

Keeping it real

How to make digital photography more trustworthy

PHOTOGRAPHY often blurs the distinction between art and reality. Modern technology has made that blurring easier. In the digital darkroom photographers can manipulate images and threaten the integrity of endeavours that rely on them. Several journalists have been fired for such activity in recent months, including one from Reuters for faking pictures in Lebanon. Earlier this year, the investigation into Hwang Woo-suk showed the South Korean scientist had changed images purporting to show
cloning. In an effort to reel in photography, camera-makers are making it more obvious when images have been altered.

One way of doing this is to use image-authentication systems to reveal if someone has tampered with a picture. These use computer programs to generate a code from the very data that comprise the image. As the picture is captured, the code is attached to it. When the image is viewed, software determines the code for the image and compares it with the attached code. If the image has been altered, the codes will not match, revealing the doctoring. Another way favoured by manufacturers is to take a piece of data from the image and assign it a secret code. Once the image file is transferred to a computer, it is given the same code, which will change if it is edited. The codes will match if the image is authentic but will be inconsistent if tampering occurred.

Algorithms are the weapon of choice for Hany Farid, a computer scientist at Dartmouth College in New Hampshire. Digital images have natural statistical patterns in the intensity and texture of their pixels. These patterns change when the picture is manipulated. Dr Farid's algorithms detect these changes, and can tell if pixels have been duplicated or removed. They also try to detect whether noise—the random variation in pixel values that creates a grainy effect—was present at the time the photograph was taken or was added later.

However, forgers have become adept at printing and rescanning images, thus creating a new original. In such cases, analysing how three-dimensional elements interact is key. Long shadows at midday are a giveaway. Even the tiny reflections in the centre of a person's pupil tell you about the surrounding light source. So Dr Farid analyses shadows and lighting to see if subjects and surroundings are consistent.
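The first authentication scheme described above (deriving a code from the raw image data at capture, then re-deriving and comparing it when the image is viewed) can be sketched with a cryptographic hash. This is a minimal illustration, not any camera-maker's actual system; the function names are hypothetical.

```python
import hashlib

def attach_code(image_bytes: bytes) -> str:
    # At capture time: derive an authentication code from the very
    # data that comprise the image.
    return hashlib.sha256(image_bytes).hexdigest()

def verify(image_bytes: bytes, attached_code: str) -> bool:
    # At viewing time: recompute the code and compare it with the
    # code attached at capture. Any change to the data changes the hash.
    return hashlib.sha256(image_bytes).hexdigest() == attached_code

original = b"raw image data from the sensor"
code = attach_code(original)

assert verify(original, code)              # untouched image: codes match
assert not verify(original + b"!", code)   # doctored image: codes differ
```

A real system would bind the code to the file with a secret key (an HMAC or a digital signature), since a plain hash could simply be recomputed by the forger after editing.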
For its part, Adobe, the maker of Photoshop software, has improved its ability to record the changes made to an image by logging how and when each tool or filter was used. Photoshop was the program used by the journalist fired by Reuters; his handiwork left a pattern in the smoke he had added that was spotted by bloggers. Thus far the internet has proven an effective check on digital forgery. Although it allows potentially fake images to be disseminated widely, it also casts many more critical eyes upon them. Sometimes the best scrutiny is simply more people looking.

Collateral damage

Why the war in Iraq is surprisingly bad news for America's defence firms

WHEN Boeing announced on August 18th that it planned to shut down production of the C-17, a huge military cargo plane, the news sent a shiver through the American defence industry. As it winds down its production line at Long Beach, California, over the next two years, Boeing will soon begin to notify suppliers that their services will no longer be needed. It had to call a halt, because orders from America's Defence Department had dried up and a trickle of export deals could not take their place. The company would not support the cost of running the production line for the C-17 (once one of its biggest-selling aircraft) on the off-chance that the Pentagon might change its mind and place further orders. The wider worry for the defence industry is that this could be the first of many big programmes to be shut down.

A big part of the problem is that America is at war. The need to find an extra $100 billion a year to pay for operations in Iraq means there is pressure to make cuts in the defence budget, which has been provisionally set at $441 billion for the fiscal year beginning in October. American defence budgets involve a complicated dance starting with what the Pentagon wants, what the White House thinks it should get and, finally, what Congress allows it to get away with.
Although the armed forces' extra spending on ammunition, fuel, provisions, medicines and accommodation in Iraq does not strictly come out of the same budget as new weapons, the heavy bill for fighting eventually leads to calls to save money on shiny new equipment. Earlier this month, for example, the Congressional Budget Office expressed “major concerns” about Future Combat Systems, a $165 billion project to upgrade all of the army's vehicles and communications networks. The scheme is the Pentagon's second-biggest development programme and is intended to give the soldiers on the ground access to real-time battlefield information from sources such as satellites and unmanned aircraft. But the programme was initially expected to cost about $82 billion, half the latest estimate, and critics are also worried about how well it will work and whether it will be delivered on time.

Last week the army issued a glowing progress report on the project and insisted that Boeing and Science Applications International Corporation, the lead contractors, are on schedule. This was welcome news to defence contractors worried that the grandiose project might fall victim to pressure for budget cuts. Even so, the prospects for many other big weapons programmes are less rosy. The problem is not just the cost of the fighting in Iraq, but also its nature. The shift in the style of warfare, towards such “asymmetric” conflicts, means that there is now less demand for big-ticket weapons systems. Things were simpler in the cold war, when the Pentagon spent about $150 billion a year on new weapons. That fell to around $50 billion after the fall of the Berlin Wall.
America's 15 main defence contractors reacted by consolidating into today's top five. When he became president, George Bush promised to increase defence spending, and he has done so: the procurement budget is back up to nearly $160 billion, despite the lack of a Soviet Union. As a result, the five main defence contractors—Lockheed Martin, Boeing, Northrop Grumman, General Dynamics and Raytheon—have had a wonderful five years. Since the terrorist attacks in September 2001, their sales have risen by around 10% a year. Last year their combined profits increased by 25% to almost $13 billion. Although most of the defence budget is spent on big weapons systems that are of little or no use in the fight against terrorists, the political climate after the attacks of September 11th 2001 made it impossible to oppose the administration's desire to increase defence spending. Besides, such spending means more jobs, often in areas where there is little other manufacturing.

More media, less news

Newspapers are making progress with the internet, but most are still too timid, defensive or high-minded

THE first thing to greet a visitor to the Oslo headquarters of Schibsted, a Norwegian newspaper firm, is its original, hand-operated printing press from 1856, now so clean and polished it looks more like a sculpture than a machine. Christian Schibsted, the firm's founder, bought it to print someone else's newspaper, but when the contract moved elsewhere he decided to start his own. Although Schibsted gives pride of place to its antique machinery, the company is in fact running away from its printed past as fast as it can. The company made a loss five years ago; last year its activities on the internet contributed 35% of operating profits. News of Schibsted's success online has spread far in the newspaper industry.
Every year, says Sverre Munck, the executive vice-president of its international business, Schibsted has to turn away delegations of foreign newspaper bosses seeking to find out how the Norwegians have done it. “Otherwise we'd get several visits every month,” he says. The company has used its established newspaper brands to build websites that rank first and second in Scandinavia for visitors. It has also created new internet businesses such as Sesam, a search engine that competes with Google, and FINN.no, a portal for classified advertising. As a result, 2005 was the company's best ever for revenues and profits. Unfortunately for the newspaper industry, Schibsted is a rare exception. For most newspaper companies in the developed world, 2005 was miserable. They still earn almost all of their profits from print, which is in decline. As people look to the internet for news and young people turn away from papers, paid-for circulations are falling year after year. Papers are also losing their share of advertising spending. Classified advertising is quickly moving online. Jim Chisholm, of iMedia, a joint-venture consultancy with IFRA, a newspaper trade association, predicts that a quarter of print classified ads will be lost to digital media in the next ten years. Overall, says iMedia, newspapers claimed 36% of total global advertising in 1995 and 30% in 2005. It reckons they will lose another five percentage points by 2015. Even the most confident of newspaper bosses now agree that they will survive in the long term only if, like Schibsted, they can reinvent themselves on the internet and on other new-media platforms such as mobile phones and portable electronic devices. Most have been slow to grasp the changes affecting their industry—“remarkably, unaccountably complacent,” as Rupert Murdoch put it in a speech last year—but now they are making a big push to catch up. 
Internet advertising is growing rapidly for many and is beginning to offset some of the decline in print. Newspapers' complacency is perhaps not as remarkable as Mr Murdoch suggested. In many developed countries their owners have for decades enjoyed near monopolies, fat profit margins, and returns on capital above those of other industries. In the past, newspaper companies saw little need to experiment or to change and spent little or nothing on research and development.

Changing connections

Fatherhood alters the structure of your brain—if you are a marmoset

PARENTING has obvious effects on mothers, but fathers appear to be affected, too. A study published this week shows that fatherhood increases the nerve connections in the region of the brain that controls goal-driven behaviour—at least, it does in marmosets. Pregnancy and motherhood have long been known to bring about changes—many of them positive—to the female brain. Pregnant and nursing rats have a greater number of neural connections, particularly in the region of the brain that controls hormones and maternal behaviour. The brain changes coincide with improvements in spatial memory and speedier foraging skills, which might help a mother rat protect and feed her young.
Just what effect parenting might have on the brains of fathers has remained an open question, however. Male rats sometimes eat their young rather than nurture them, which makes them a poor model for studying how fatherhood affects the brains of species that frown on infanticide. Marmoset fathers, on the other hand, are a model of paternal devotion. They carry their babies for more than half the time during the offspring's first three months, passing them to the mother only when the babies need to be fed.

Elizabeth Gould of Princeton University and her colleagues compared the brains of marmoset fathers with those of males that lived in mated pairs, but lacked offspring. They found substantial differences. The nerve cells in the prefrontal cortex of fathers had more tiny projections, known as dendritic spines, than those of non-fathers. Because dendritic spines are the sites of connection between neighbouring nerve cells, the increased number may mean more activity in the fathers' brains than in those of the non-fathers. The nerve cells in the prefrontal cortex of the fathers also contained more receptors for vasopressin, a small peptide that influences social behaviour and pair-bonding. The researchers also showed that, as the offspring aged and fathers became detached from them, the abundance of vasopressin receptors fell. This suggests that the parental behaviour is associated with the changes in the brain. The work is published this week in Nature Neuroscience.

What does this mean for human fathers? It is hard to tell. The attention of marmoset fathers makes them an extreme example of fatherhood. Human fathers do not usually get involved to anything like the same extent. That said, the scientists looked at the marmoset's prefrontal cortex because earlier studies had shown that activity in the prefrontal cortex of human parents—male and female alike—increases when they see their own offspring. Thus the same brain region is active in parenting in both species.
Craig Kinsley of the University of Richmond, Virginia, who did the work with rat mothers, speculates that Dr Gould's new findings may reflect human behaviour quite closely. “There is a lot of interest in the idea that having children forces responsibility on males in many respects. If you consider that the prefrontal cortex plays a major role in planning, judgment and the anticipation of the consequences of behaviour, you could make a clear argument that the changes in that part of the brain would be involved with judicious attention toward offspring.”

Life 2.0
The new science of synthetic biology is poised between hype and hope. But its time will soon come

IN 1965 few people outside Silicon Valley had heard of Gordon Moore. For that matter, no one at all had heard of Silicon Valley. The name did not exist and the orchards of Santa Clara county still brought forth apples, not Macintoshes. But Mr Moore could already discern the outlines. For 1965 was the year when he published the paper that gave birth to his famous “law” that the power of computers, as measured by the number of transistors that could be fitted on a silicon chip, would double every 18 months or so.

Four decades later, equally few people have heard of Rob Carlson. Dr Carlson is a researcher at the University of Washington, and some graphs of the growing efficiency of DNA synthesis that he drew a few years ago look suspiciously like the biological equivalent of Moore's law. By the end of the decade their practical upshot will, if they continue to hold true, be the power to synthesise a string of DNA the size of a human genome in a day. At the moment, what passes for genetic engineering is mere pottering. It means moving genes one at a time from species to species so that bacteria can produce human proteins that are useful as drugs, and crops can produce bacterial proteins that are useful as insecticides. True engineering would involve more radical redesigns.
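The force of Moore's 18-month doubling, which the Carlson curve appears to mirror for DNA synthesis, is easy to underestimate; a minimal sketch of the compounding (the 40-year horizon and starting point are illustrative, not from the article):

```python
# How a Moore's-law doubling compounds: anything that doubles every
# 18 months grows more than a hundred-million-fold in four decades.

DOUBLING_MONTHS = 18

def growth_factor(years):
    """Fold increase after `years` of doubling every 18 months."""
    return 2 ** (years * 12 / DOUBLING_MONTHS)

# One doubling period gives a factor of exactly 2.
print(f"{growth_factor(1.5):.0f}")

# Four decades, the span between Moore's 1965 paper and this article.
print(f"{growth_factor(40):,.0f}")  # over a hundred million
```

The same arithmetic is what makes the prospect of synthesising a genome-sized string of DNA in a day plausible if the efficiency curves continue to hold.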
But the Carlson curve (Dr Carlson disavows the name, but that may not stop it from sticking) is making that possible. In the short run such engineering means assembling genes from different organisms to create new metabolic pathways or even new organisms. In the long run it might involve re-writing the genetic code altogether, to create things that are beyond the range of existing biology. These are enterprises far more worthy of the name of genetic engineering than today's tinkering. But since that name is taken, the field's pioneers have had to come up with a new one. They have dubbed their fledgling discipline “synthetic biology”.

No intelligent designer would have put the genomes of living organisms together in the way that evolution has. Some parts overlap, meaning that they cannot change jobs independently of one another. Others have lost their function but have not been removed, so they simply clutter things up. And there is no sense of organisation or hierarchy. That is because, unlike an engineer, evolution cannot go back to the drawing board; it can merely play with what already exists. Biologists, who seek merely to understand how life works, accept this.

Pay dirt
Buckets, not big irrigation systems, can prevent the world running dry

WILL the world run short of water to grow crops? Not if it invests in the right projects, according to a group of scientists and economists that has been studying the question for the past five years. The “Comprehensive Assessment”, co-ordinated by the Consultative Group on International Agricultural Research, an international network of research institutes, will not be released until November. But at a recent conference in Sweden, the researchers revealed some preliminary findings. The good news is that small investments in infrastructure for water can yield big returns. Unfortunately, governments and donors have concentrated on more grandiose but less helpful schemes, leaving a third of the world's population—some 2 billion people—short of water.

Agriculture sucks up perhaps 95% of the water humans use. It takes roughly 3,000 litres to grow enough food for one person for one day, or about one litre for each calorie. Moreover, the world's population is growing, and people are eating more than they used to. So the assessment's projections suggest that if nothing changes, agriculture will consume twice as much water by 2050 as it does today. That is a tall order: as it is, some 900m people, the assessment finds, live in river basins where humans consume more than 75% of the water, leaving barely enough to keep rivers flowing and lakes filled. Another 700m live in basins rapidly approaching this “closed” state; 1 billion more live within reach of adequate water supplies, but cannot afford to gain access to them. The water table is falling fast in densely populated and poor regions of China, Mexico and India.

In theory, the world should still have more than enough water to feed everyone under most circumstances, thanks to sodden places like Canada and Russia. But exploiting the surplus would require much more trade in food from damp spots to the parched ones.
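The rule of thumb above, about one litre of water per calorie of food, is what generates the 3,000-litre figure; a quick check, scaled up to a world population (the 3,000-calorie diet and the population figure are illustrative assumptions, not from the assessment):

```python
# Sanity-check the article's water arithmetic: one litre per calorie,
# roughly 3,000 litres per person per day.

LITRES_PER_CALORIE = 1.0
daily_calories = 3_000  # an illustrative daily diet

daily_water = daily_calories * LITRES_PER_CALORIE
print(f"{daily_water:,.0f} litres per person per day")

# Scaled to a mid-2000s world population, the totals become enormous.
population = 6_500_000_000  # rough figure, assumed for illustration
annual_litres = daily_water * population * 365
print(f"about {annual_litres / 1e12:,.0f} trillion litres a year")
```

Numbers of this order explain why the assessment worries about river basins "closing" long before the world as a whole runs dry.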
A few poor, dry and teeming countries, such as Egypt, along with the odd rich one, like Japan, already depend on imports of food. But most governments are loth to put their citizens at the mercy of the world's imperfect markets. Instead, governments have tended to try to increase agricultural output through expensive irrigation projects. But smaller investments in simple devices, such as pumps to tap groundwater, are faster to deploy, yield greater returns on capital and bring fewer environmental and social problems. A recent study of vegetable farmers in Ghana, for example, found that those irrigating their fields with wastewater carried by buckets earned a 230% return on their investment, versus 30% for big state-sponsored schemes.

The assessment argues that modest outlays on rain-fed agriculture, in particular, could drastically improve the productivity of farming in poor countries and so help both to raise farmers' incomes and to cut the need for an expansion of agriculture elsewhere. More than half of the world's food comes from rain-fed farms, as opposed to irrigated ones. If the rains fail, so do the crops. Channels to harvest and direct rainfall, and small, sealed reservoirs or tanks to store it, would not only see farmers through dry spells but also allow them to entice bigger or more valuable harvests out of the same fields. More reliable income, in turn, allows farmers to invest more in seeds, fertiliser and machinery. In Tanzania rainwater harvesting allows farmers to grow rice or vegetables instead of staples like sorghum and maize. These dearer crops bring in at least twice the revenue, and up to seven times as much in good years.

If adopted on a grand scale, the assessment argues, such techniques could double crop yields. In that case, the area under cultivation globally would have to rise by only 10% to satisfy growing demand for food by 2050—and there would be plenty of water to go round.
On the hiking trail
Globalisation is generating huge economic gains. That is no reason to ignore its costs

VISITORS to Jackson Hole, Wyoming, normally see a few moose and buffalo and sometimes even a bear. But in late August each year some really strange creatures can be spotted: central bankers and economists, meeting for the annual symposium of the Federal Reserve Bank of Kansas City, one of the high points (literally) of the economic calendar. This year's conference focused on how the rise of China, India and other countries is reshaping the world economy.

Take wages. It is commonly believed that the wages of unskilled workers in rich countries are being depressed by the shift of jobs to low-wage countries. However, a paper presented at the symposium by Gene Grossman and Esteban Rossi-Hansberg of Princeton University offered a much rosier view, arguing that offshoring can actually increase the wages of unskilled workers. Moving jobs abroad boosts firms' productivity and profits and so enables them to take on more workers at home, which pushes up the wages for work that cannot easily be outsourced abroad. The authors are right to point out that the impact of offshoring on jobs is not as bad as it is usually portrayed. But their own calculations show that between 1997 and 2004 this positive “productivity
effect” was not large enough to offset the downward pressure on wages, both from having more workers in the world and from cheaper labour-intensive goods as a result of imports from low-wage countries. The net impact is still to depress the wages of low-skilled workers.

Indeed, the evidence is that the low-skilled are not the only people being squeezed. In America, the euro area and Japan, total wages have fallen to their lowest share of national income in decades, whereas the share of profits has surged. This is exactly what would have been expected, given that the integration into the world economy of the emerging economies has sharply increased the ratio of global labour to capital. Yet this fact barely got a mention in Jackson Hole. In their eagerness to applaud the benefits of globalisation to economies as a whole, economists were strangely reluctant to admit that in recent years the average real pay of rich-country workers has stagnated or even fallen.

Ben Bernanke, the chairman of the Federal Reserve, was one of the few to voice the case for helping the losers. The scale and pace of globalisation, he argued, is unprecedented and the overall gains will be huge. But there is a risk of social and political opposition as some workers lose their jobs. Policymakers, he said, need “to ensure that the benefits of global economic integration are sufficiently widely shared” so as to maintain support for free trade and to stem protectionism. The snag is that the number of losers—including those facing lower real wages—may be bigger than he thinks.

The non-denial of the non-self
How philosophy can help create secure databases

IN THE 1940s a philosopher called Carl Hempel showed that by manipulating the logical statement “all ravens are black”, you could derive the equivalent “all non-black objects are non-ravens”. Such topsy-turvy transformations might seem reason enough to keep philosophers locked up safely on university campuses, where they cannot do too much damage.
However, a number of computer scientists, led by Fernando Esponda of Yale University, are taking Hempel's notion as the germ of an eminently practical scheme. They are applying such negative representations to the problem of protecting sensitive data. The idea is to create a negative database. Instead of containing the information of interest, such a database would contain everything except that information. The concept of a negative database took shape a couple of years ago, while Dr Esponda was working at the University of New Mexico with Paul Helman, another computer scientist, and Stephanie Forrest, an expert on modelling the human immune system.

The important qualification concerns that word “everything”. In practice, that means everything in a particular set of things. What interested Dr Esponda was how the immune system represents information. Here, “everything” is the set of possible biological molecules, notably proteins. The immune system is interesting because it protects its owner from pathogens without needing to know what a pathogen will look like. Instead, it relies on a negative database to tell it what to destroy. It learns early on which biological molecules are “self”, in the sense that they are routine parts of the body it is protecting. Whenever it meets one that is “not self”, and thus likely to be part of a pathogen, it destroys it. In Hempel's terms, this can be expressed as “all non-good agents [pathogens] are non-self”.

The analogy with a computer database is not perfect. The set of possible biomolecules is not infinite, but it is certainly huge, and probably indeterminable. The immune system does not care about this, because it has to recognise only what is not in its own database. Make one adjustment, though, and you have something that might work for computers. That adjustment is to define “everything” as a finite set, all of whose members can be known—for instance, all phrases containing a fixed maximum number of characters.
A database of names, addresses and Social Security numbers (a common form of identification in America) might need entries of only 200 characters to cover all possible combinations. That would limit the total number of character combinations. A positive database containing all the data in question would be a small subset of those combinations. The negative counterpart of this database would be much larger, containing all possible names and addresses that were not in the positive database, plus a lot of gibberish. But it would not be infinite.

By looking at the negative database, it would be possible to deduce what was in the positive database it complemented. That would not guarantee security against a search for the presence or absence of a particular name and address. Indeed, the whole point is that such searches should be possible. But it would prevent fishing expeditions by making it impossible, for example, to look for the Social Security numbers of all the people living on one street.
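The mechanics of the finite-universe adjustment can be sketched in a few lines. This is a toy illustration only: the alphabet and record length are tiny so the complement can be enumerated outright, whereas the real scheme of Dr Esponda and his colleagues stores the complement compactly and makes enumeration of the positive records computationally hard.

```python
from itertools import product

# Toy negative database: fix a finite universe of possible records,
# then store everything that is NOT a record.
ALPHABET = "abc"
LENGTH = 3  # fixed-length entries make the universe finite (27 strings)

universe = {"".join(p) for p in product(ALPHABET, repeat=LENGTH)}

positive_db = {"abc", "cab"}          # the sensitive records
negative_db = universe - positive_db  # everything except the records

def is_member(entry):
    """A point query: an entry is in the positive database
    exactly when it is absent from the negative one."""
    return entry not in negative_db

print(is_member("abc"))  # True  - a stored record
print(is_member("bbb"))  # False - not a record
```

A point query ("is this exact name and number present?") works as before, which is the whole point; what the representation is meant to frustrate is listing the positive records wholesale.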

Would you fly in chattering class?
The use of mobile phones on planes moves another step closer

LIKE it or not, the prospect of being able to use mobile phones on aeroplanes is inching ever closer. Last week Ryanair, a European low-cost carrier, announced that it would equip its entire fleet of Boeing 737s with small base stations, called picocells, provided by OnAir, a technology company backed by Airbus, Europe's aviation giant. The picocells will use satellite links to allow mobile phones to be used during flight without interfering with ground-based networks. (Such interference, rather than safety concerns, is the primary reason that in-flight use of mobile phones is banned at the moment.) Taking a cut of the resulting revenues will help Ryanair to keep its ticket prices down, according to Michael O'Leary, the firm's boss. But it is uncertain just how popular, and hence how lucrative, in-flight calling will be.

The technical obstacles have been overcome and regulatory approval is expected soon—at least in Europe. Regulators are expected to issue guidelines in the next few weeks defining which frequencies can be used, and national aviation authorities will start certifying airlines' installations early next year. OnAir says it expects its technology to be approved in time for Air France to launch in-air calling on an Airbus A318 in the first quarter of 2007.

Mobile phones must still surmount social obstacles, however. Will people flock to airlines that offer in-flight calling, or avoid them? And how much will callers be prepared to pay? OnAir, its rival AeroMobile and other firms have conducted a series of surveys in an attempt to find out. According to the latest survey, released by OnAir this week, 80% of airline passengers approve of the idea of being able to use telephones on aircraft, even if they do not plan to do so themselves. But only 54% of business travellers and 41% of leisure travellers said they would switch their phones on during a flight.
One reason is cost: George Cooper, the boss of OnAir, says that at prices above $3 per minute, “demand drops off considerably”, according to the firm's research. He expects $2.50 per minute to be the norm when services based on his firm's technology are launched by Air France, Ryanair, bmi and TAP Portugal. But OnAir will then cut its prices by 10% a year for five years, he says.

So the availability of in-flight communications seems likely to vary widely. Offering such a service, or not offering it, could enable airlines to differentiate themselves on particular routes. They could also impose quiet periods or mobile-free cabins. On long-haul flights, texting and Wi-Fi access might prove more attractive than voice-calling, provided the price is right—though Boeing recently axed Connexion, its impressive but little-used satellite-based in-flight broadband service. In-flight calling may in fact prove best suited to short-haul, low-cost flights. After all, people choose Ryanair and other low-cost carriers because of their low prices—not because they enjoy the flight. So the prospect of other passengers shouting into their phones may not put people off. Quite the opposite, in fact, if it can help to subsidise low ticket prices.

Under attack
Why are drug patents now coming under legal attack?

AT FIRST blush, the Plavix affair appears to be an astonishing victory for Apotex, a cheeky Canadian drugs firm that launched a generic version of the world's second-best-selling drug. For a few weeks in August the generic version captured nearly three-quarters of the American market for the $6 billion-a-year drug—until Sanofi-Aventis and Bristol-Myers Squibb, the makers of Plavix, asked a judge to halt sales, which he did on August 31st. Look closer, though, and it becomes clear that Apotex will be a winner only if it defeats Sanofi's patents on Plavix in a case that starts in January.
Regardless of the outcome, the case raises an important legal question: how can a firm like Apotex bring a generic drug to market when others own a perfectly valid patent for it? The answer is that the potential prize is simply so large these days that the reward outweighs the risk of legal defeat. The multi-billion-dollar sales of today's blockbuster drugs have invited greater legal scrutiny of patents and encouraged generics firms to find ways to innovate around them.

The result is a relentless legal attack on branded drugs. Eli Lilly has seen its patents on Prozac, Evista and Zyprexa, three of its biggest-selling drugs, challenged by generics firms. Pfizer faces frequent patent challenges on Lipitor, the cholesterol-reducing remedy that is the world's best-selling drug. Dozens of other patent challenges are pending.

Curiously, rather than using the law to defend their patents, big firms often settle out of court. Sanofi and Bristol-Myers Squibb made just such a deal with Apotex, but it was deemed to be illegal. Shire, a British firm that makes a drug to combat attention-deficit disorder, got Barr Laboratories to agree in August to delay its generic launch until 2009. But why would a firm with a legal patent strike such a deal? One reason could be that some drugs giants regard settlements as
a way to bribe a generics firm to delay its introduction of a cut-price product. American antitrust officials worry this is to the detriment of the consumer. Another explanation is that the cost and legal uncertainty associated with patent trials are simply too great. Daniel Glazer of Shearman & Sterling, a big law firm, argues that even a firm convinced of the integrity of its patents may well settle “to avoid the all-or-nothing scenario”.

But there is a less charitable explanation. The big firm may know that its patent was mistakenly awarded, perhaps because the purported breakthrough was too minor or obvious. In Barr's ongoing case against Eli Lilly's Evista, the generic firm argues that a prior patent held by the University of Pennsylvania invalidates Lilly's claims. Kathleen Jaeger of America's Generic Pharmaceutical Association adds that branded firms try to extend their lucrative monopolies by filing less rigorous secondary patents designed “to block generics”. David Balto, a former official at America's Federal Trade Commission, says, “Branded pharmaceutical firms have been stretching the limits of what deserves a patent, and the courts are just catching up.”

Ready or not
Europe's financial sector is ill prepared for a coming upheaval

SOME of the most breathless commentary about Europe's financial markets in recent years has centred on the intrigues and dalliances of leading financial exchanges. All of them have flirted with, encouraged and snubbed various potential partners in both Europe and America, although no big deals have yet been completed. Amid the chatter, an important cause of all the matchmaking and matchbreaking has been largely overlooked: a piece of looming legislation that, for all its drab detail, will alter the European Union's financial markets profoundly.
Exchanges are not the only ones to feel the hot breath of the unenticingly labelled Markets in Financial Instruments Directive, known as MiFID, which is due to take effect from November 2007. An important element of the EU's plan for a single market in financial services, the directive embraces both wholesale and retail trading in securities, including shares, bonds and derivatives. As such, it will affect companies from investment banks to asset managers and stockbrokers. Some will benefit more than others.

Charlie McCreevy, the European commissioner in charge of forging a single market, jokes about the ugly moniker: “This is not a fearsome man-eating plant.” But he is evangelical about the directive's purpose. He expects MiFID to “transform” the trading of securities in Europe, reducing the cost of capital, creating growth and increasing Europe's competitiveness in the global economy.

The directive, which EU member states are supposed to weave into their own laws by January 2007, intends to accomplish all this in several ways. First, the rules aim to increase competition across borders, by extending the “single passport”, which allows financial firms to do business across Europe armed only with the approval of their home authorities. To make this possible, investor-protection rules are also to be harmonised, so as to provide a (theoretically) consistent standard in areas such as investment advice, order-handling and the completion of securities trades—“best execution”, in the jargon.

Second, MiFID aims to change the nature of competition in share trading. Although most shares in Europe are still traded on exchanges, there is growing interest in alternatives, such as off-exchange trading between investment banks. MiFID could accelerate this trend. In some countries—notably France, Italy and Spain—existing rules force all share trades through local bourses. The new rules will end those monopolies.
No wonder exchanges, facing the threat of greater competition, are weighing up mergers. A third intention of MiFID is more transparency. In future, investors should be able to subscribe to information services that let them see the whole market in certain shares, not only what is on offer at the local stock exchange. The goal is to let investors find the best prices in the market. This will mean competition for the London Stock Exchange, for example, which earns a healthy sum from selling such information. Investment banks are already banding together to develop alternative reporting services.

Checking the thermostat
Property prices are cooling fast in America, but heating up elsewhere

HOUSES are not just places to live in; they are increasingly important to whole economies, which is why The Economist started publishing global house-price indicators in 2002. This has allowed us to track the biggest global property-price boom in history. The latest gloomy news from America may suggest that the world is on the brink of its biggest ever house-price bust. However, our latest quarterly update suggests that, outside America, prices are perking up.

America's housing market has certainly caught a chill. According to the Office of Federal Housing Enterprise Oversight (OFHEO), the average price of a house rose by only 1.2% in the second quarter, the smallest gain since 1999. The past year has seen the sharpest slowdown in the rate of growth since the series started in 1975. Even so, average prices are still up by 10.1% on a year ago. This is much stronger than the series published by the National Association of Realtors (NAR), which showed a rise of
only 0.9% in the year to July. The OFHEO index is thought to be more reliable because it tracks price changes in successive sales of the same houses, and so, unlike the NAR series, is not distorted by a shift in the mix of sales towards cheaper homes. The snag is that the data take time to appear. Prices for this quarter, which will not be published until December, may well be much weaker. A record level of unsold homes is also likely to weigh prices down. The housing futures contract traded on the Chicago Mercantile Exchange is predicting a fall of 5% next year.

Elsewhere, our global house-price indicators signal a cheerier story. House-price inflation is faster than a year ago in roughly half of the 20 countries we track. Apart from America, only Spain, Hong Kong and South Africa have seen big slowdowns. In ten of the countries, prices are rising at double-digit rates, compared with only seven countries last year. European housing markets—notably Denmark, Belgium, Ireland, France and Sweden—now dominate the top of the league. Anecdotal evidence suggests that even the German market is starting to wake up after more than a decade of flat or falling prices, but this has yet to show up in the index that we use, which is published with a long lag (there are no figures for 2006). If any readers know of a more timely index, please let us know.

Some economists have suggested that Britain and Australia are “the canaries in the coal mine”, giving early warning of the fate of America's housing market. The annual rate of increase in house prices in both countries slowed from around 20% in 2003 to close to zero last summer. However, the canaries have started to chirp again. In Australia average prices have picked up by 6.4% over the past year, although this is partly due to a 35% surge in Perth on the back of the commodities boom. Likewise British home prices have perked up this year, to be 6.6% higher, on average, than they were a year ago.
Thus it is claimed that housing markets in Britain and Australia have had a soft landing.

Mind the gap
Pay discrimination between male and female scientists

SEVEN years ago, a group of female scientists at the Massachusetts Institute of Technology produced a piece of research showing that senior women professors in the institute's school of science had lower salaries and received fewer resources for research than their male counterparts did. Discrimination against female scientists has cropped up elsewhere. One study—conducted in Sweden, of all places—showed that female medical-research scientists had to be twice as good as men to win research grants. These pieces of work, though, were relatively small-scale. Now, a much larger study has found that discrimination plays a role in the pay gap between male and female scientists at British universities.

Sara Connolly, a researcher at the University of East Anglia's school of economics, has been analysing the results of a survey of over 7,000 scientists, and she has just presented her findings at this year's meeting of the British Association for the Advancement of Science in Norwich. She found that the average pay gap between male and female academics working in science, engineering and technology is around £1,500 ($2,850) a year.

That is not, of course, irrefutable proof of discrimination. An alternative hypothesis is that the courses of men's and women's lives mean the gap is caused by something else: women taking “career breaks” to have children, for example, and thus rising more slowly through the hierarchy. Unfortunately for that idea, Dr Connolly found that men are also likely to earn more within any given grade of the hierarchy. Male professors, for example, earn over £4,000 a year more than female ones.
To prove the point beyond doubt, Dr Connolly worked out how much of the overall pay differential was explained by differences such as seniority, experience and age, and how much was unexplained, and therefore suggestive of discrimination. Explicable differences amounted to 77% of the overall pay gap between the sexes. That still left a substantial 23% gap in pay, which Dr Connolly attributes to discrimination.

Besides pay, her study also looked at the “glass-ceiling” effect—namely, that at all stages of a woman's career she is less likely than her male colleagues to be promoted. Between postdoctoral and lecturer level, men are more likely to be promoted than women are, by a factor of between 1.04 and 2.45. Such differences are bigger at higher grades, with the hardest move of all being for a woman to settle into a professorial chair. Of course, it might be that, at each grade, men do more work than women, making themselves more eligible for promotion. But that explanation, too, seems to be wrong. Unlike the previous studies, Dr Connolly's compared the experience of scientists in universities with that of those in other sorts of laboratory. It turns out that female academic researchers face more barriers to promotion, and have a wider gap between their pay and that of their male counterparts, than do their sisters in industry or research institutes independent of universities. Private enterprise, in other words, delivers more equality than the supposedly egalitarian world of academia does.
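The explained/unexplained split that Dr Connolly computed is a standard decomposition: value the male-female difference in each measured characteristic at the going pay rate for it, and whatever is left of the gap is "unexplained". A minimal sketch of the method, with invented coefficients and group means (only the £1,500 total and the approach itself come from the article):

```python
# Oaxaca-Blinder-style decomposition of a pay gap into an explained
# part (seniority, experience) and an unexplained residual.
# All coefficients and means below are illustrative assumptions.

male_coefs = {"seniority": 900.0, "experience": 250.0}  # pounds per unit
male_means = {"seniority": 3.2, "experience": 14.0}
female_means = {"seniority": 2.6, "experience": 11.5}

total_gap = 1500.0  # the observed average gap, pounds per year

# Explained part: price each characteristic difference at male rates.
explained = sum(male_coefs[k] * (male_means[k] - female_means[k])
                for k in male_coefs)
unexplained = total_gap - explained

print(f"explained:   {explained:.0f} ({explained / total_gap:.0%})")
print(f"unexplained: {unexplained:.0f} ({unexplained / total_gap:.0%})")
```

With these made-up inputs the explained share comes out near the 77% reported in the study; the unexplained residual is the part attributed to discrimination.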

Alpha betting
The fund-management industry is splitting in two—and investors are gambling on the expensive bit

IT HAS never been easier to pay less to invest. No fewer than 136 exchange-traded funds (ETFs) were launched in the first half of 2006, more than in the whole of 2005. For those who believe in efficient markets, this represents a triumph. ETFs are quoted securities that track a particular index, for a fee that is normally just a fraction of a percentage point. They enable investors to assemble a low-cost portfolio covering a wide range of assets, from international equities, through government and corporate bonds, to commodities. Morgan Stanley estimates that ETFs control some $487 billion of assets, up 16.7% from a year ago. It predicts they will have $2 trillion of assets by 2011. No longer must investors be at the mercy of error-prone and expensive fund managers.

But as fast as the assets of ETFs and index-tracking mutual funds are growing, another section of the industry seems to be flourishing even faster. Watson Wyatt, a firm of actuaries, estimates that “alternative asset investment” (ranging from hedge funds through private equity to property) grew by around 20% in 2005, to $1.26 trillion. Investors who take this route pay much higher fees in the hope of better performance. One of the fastest-growing assets, funds of hedge funds, charges some of the highest fees of all.

At first sight, this might seem like a typical market, with low-cost commodity producers at one end and high-charging specialists at the other. Buy a Rolls-Royce rather than a Trabant and you can expect a higher standard of luxury and engineering in return for the much greater price. But fund management is not like any other industry; paying more does not necessarily get you a better service. An index represents the average performance of all investors, before costs are deducted. If the fee paid to the fund manager increases, the return achieved by the average investor must decline.
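That last point is pure arithmetic: the average investor earns the index return before costs, so fees come straight off the top. A small sketch with illustrative figures (the pre-cost return echoes the S&P total return quoted in this article; the fee levels are assumptions):

```python
# The zero-sum arithmetic of fund fees: before costs, the average
# investor earns the index return; after costs, the average return
# falls one-for-one with the fee charged.

index_return = 0.058  # pre-cost market return, illustrative

def average_net_return(annual_fee):
    """Average investor's return after an annual fee."""
    return index_return - annual_fee

etf_net = average_net_return(0.002)     # a tracker charging 0.2%
active_net = average_net_return(0.015)  # an active fund charging 1.5%

print(f"tracker: {etf_net:.1%}")  # 5.6%
print(f"active:  {active_net:.1%}")  # 4.3%
```

No amount of skill changes this for investors in aggregate: individual managers can beat the index, but their outperformance is someone else's underperformance, while the fees are paid by everyone.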
After fees, hedge-fund returns this year have been feeble. From January 1st through to August 31st, the average hedge fund returned just 4.2%, according to Merrill Lynch, less than the S&P 500 index's 5.8% total return. So why are people paying up? In part, because investors have learned to distinguish between the market return, dubbed beta, and managers' outperformance, known as alpha. “Why wouldn't you buy beta and alpha separately?” asks Arno Kitts of Henderson Global Investors, a fund-management firm. “Beta is a commodity and alpha is about skill.”

The fund-management split began with the decline of balanced managers, which took complete charge of an investor's portfolio, running everything from American equities through Japanese bonds to property. Clients became convinced that no one firm could produce good performance in every asset class, nor could it master the art of timing the switch from one asset to another.

Powering up
Improved devices may make better use of sunlight

MOST of the power generated by mankind originates from the sun. It was sunlight that nurtured the early life that became today's oil, gas and coal. It is the solar heating of the Earth's atmosphere and oceans that fuels wave power, wind farms and hydroelectric schemes. But using the sun's energy directly to generate power is rare. Solar cells account for less than 1% of the world's electricity production. Recent technological improvements, however, may boost this figure.

The root of the problem is that most commercial solar cells are made from silicon, and silicon is expensive. Cells can be made from other, cheaper materials, but these are not as efficient as those made from silicon. The disparity is stark. Commercial silicon cells have efficiencies of 15% to 20%. In the laboratory, some have been made with an efficiency of 30%. The figure for non-traditional cells is far lower. A typical cell based on electrically conductive plastic has an efficiency of just 3% or 4%.
What is needed is a way to boost the efficiency of cells made from cheap materials, and three new ways of doing so were unveiled this week in San Francisco, at the annual meeting of the American Chemical Society. Solar cells work by the action of light on electrons. An electron held in a chemical bond in the cell absorbs a photon (a particle of light) and, thus energised, breaks free. Such electrons can move about and, if they all move in the same direction, create an electric current. But they will not all travel in the same direction without a little persuasion. With silicon, this is achieved using a secondary electrical field across the cell. Non-silicon cells usually have a built-in “electrochemical potential” that encourages the electrons to move away from areas where they are concentrated and towards places where they have more breathing space.
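The efficiency figures translate directly into deliverable power: a cell's electrical output is simply the incident sunlight times its efficiency. A back-of-the-envelope sketch, assuming the standard test-condition insolation of about 1,000 W per square metre (that figure is an assumption, not from the article):

```python
# Why efficiency matters: output per square metre = insolation x efficiency.
# The 1,000 W/m^2 insolation is the usual standard-test assumption.

PEAK_INSOLATION_W_PER_M2 = 1000.0

def output_w_per_m2(efficiency_pct: float) -> float:
    """Electrical power delivered per square metre of cell."""
    return PEAK_INSOLATION_W_PER_M2 * efficiency_pct / 100.0

for label, eff in [("commercial silicon", 17.5),   # midpoint of 15-20%
                   ("lab silicon", 30.0),
                   ("conductive plastic", 3.5)]:   # midpoint of 3-4%
    print(f"{label:>20}: {output_w_per_m2(eff):6.0f} W/m^2 at {eff}% efficiency")
```

A plastic cell thus needs roughly five times the area of a commercial silicon cell to deliver the same power, which is why cheap materials only pay off if efficiency can be raised.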

Kwanghee Lee of Pusan National University, in South Korea, and Alan Heeger of the University of California, Santa Barbara, work on solar cells made of electrically conductive plastics. (Indeed, Dr Heeger won a Nobel prize for discovering that some plastics can be made to conduct electricity.) They found that by adding titanium oxide to such a cell and then baking it in an oven, they could increase the efficiency with which it converted solar energy into electricity. The trick is to put the titanium oxide in as a layer between the part of the cell where the electrons are liberated and the part where they are collected for dispatch into the wider world. This makes the electrically conductive plastic more sensitive to light at wavelengths where sunlight is more intense. Pop the resulting sandwich in the oven for a few minutes at 150°C and the plastic layer becomes crystalline. This improves the efficiency of the process, because the electrons find it easier to move through crystalline structures. The technique used by Dr Lee and Dr Heeger boosts the efficiency of plastic cells to 5.6%. That is still poor compared with silicon, but it is a big improvement on what was previously possible. Dr Lee concedes that there is still a long way to go, but says that even an efficiency of 7% would bring plastic cells into competition with their silicon cousins, given how cheap they are to manufacture. A second approach, taken by Michael Grätzel of the Swiss Federal Institute of Technology, is to copy nature. Plants absorb solar energy during photosynthesis. They use it to split water into hydrogen ions, electrons and oxygen. The electrons released by this reaction are taken up by carrier molecules and then passed along a chain of such molecules before being used to power the chemical reactions that ultimately make sugar.
Dolling up the dole A better way to help America's jobless “MANY of our most fundamental systems—the tax code, health coverage, pension plans, worker training—were created for the world of yesterday, not tomorrow. We will transform these systems.” With these words George Bush laid out an agenda of domestic reform at the Republican convention in 2004. That agenda, starting with last year's attempt to transform America's vast state pension system, has gone nowhere. But Mr Bush's basic argument is right. Much of the machinery of America's domestic economic policy dates from the 1930s and needs repair. Unemployment insurance is a case in point. Created by Franklin Roosevelt in 1935, America's dole has barely changed since. It provides temporary income support to laid-off workers and is financed by a small tax on wages. The details vary from state to state, but full-time workers who lose their jobs get a cheque worth, on average, just over a third of their previous wage for up to six months. Benefits can be paid for longer if the economy is in recession, but only if Congress agrees. By European standards, America's dole is short-lived, a characteristic that encourages people to get a new job quickly. As a macroeconomic tool, the dole works well. Unemployment cheques support spending when workers are laid off, helping to smooth the business cycle. But the cash is not aimed at those who need it most. That is because a rising share of the unemployed are not laid off temporarily, but have seen their jobs disappear for good. Research by Erica Groshen and Simon Potter, of the Federal Reserve Bank of New York, suggests that whereas temporary lay-offs explained much of the jumps in unemployment during the recessions of the 1970s and early 1980s, nowadays structural job losses dominate. People who are unemployed because their job has gone permanently need to find new lines of work. 
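The benefit formula described above is simple enough to sketch: a bit over a third of the previous wage, for at most six months. The exact replacement rate varies from state to state, so the 35% used here is an illustrative assumption:

```python
# Stylised version of the American dole described above: roughly a third of
# the previous wage, capped at six months. The 35% rate is an assumption;
# real state rules differ.

REPLACEMENT_RATE = 0.35   # "just over a third" of the previous wage
MAX_MONTHS = 6

def total_benefit(previous_monthly_wage: float, months_unemployed: int) -> float:
    """Total cash received over one spell of unemployment."""
    covered_months = min(months_unemployed, MAX_MONTHS)
    return previous_monthly_wage * REPLACEMENT_RATE * covered_months

print(total_benefit(3000.0, 4))   # a four-month spell is fully covered
print(total_benefit(3000.0, 10))  # a ten-month spell is capped at six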
It takes them longer to find a job and, when they do, they are often paid considerably less than before. Jeffrey Kling, an economist at the Brookings Institution, argues that the unemployment-benefit system ought to distinguish those who are temporarily out of a job but may find similar, or higher-paid, work from those who face permanently lower income. In a paper for the Hamilton Project, a research programme at Brookings that seeks new policies for America's centre-left, Mr Kling suggests that the dole should become less like a handout from the government and more like an insurance policy that individual workers finance themselves. The idea is to give every worker an account, unsnappily called a “temporary earnings replacement account”, or TERA. While in work, people could set aside money in these accounts. Those who lose their jobs could take cash out. The level and duration of withdrawals would be set by the government and would be the same as under today's unemployment system.

Bitter consequences Green vegetables really do taste horrible “EAT up your greens” is the exasperated cry of many a parent when faced with a fussy child. But the paradox of vegetables is that they are both good and bad for you. The cultivated plants consumed by everyone except hunter-gatherers have evolved an
ambiguous relationship with people, in which they exchange the risk of being eaten by a human for the reproductive security that domestication brings. But the wild plants from which these cultivars are descended are very definite about the matter. They do not want to be consumed and they make that opinion known by deploying all sorts of poisonous chemicals to discourage nibbling herbivores. In many cases, those poisons have persisted into the cultivated varieties, albeit at lower levels. Animals, of course, have evolved ways of dealing with these poisons. The best of these, from a plant's point of view, is when an animal can taste, and thus reject, a poisonous chemical. This has long been assumed to be the basis of the taste of bitterness, but that theory has only now been put to a clear test. In a paper just published in Current Biology, Mari Sandell and Paul Breslin, of the Monell Chemical Senses Centre, in Philadelphia, have looked at the phenomenon in that bête noire of presidents and parents alike: broccoli. Bitter tastes are detected by receptor proteins that are, in turn, encoded by a family of genes known collectively as TAS2R. Humans have around 25 TAS2R genes, each sensitive to different groups of chemicals. That variety, in itself, indicates the range of the plant kingdom's weaponry. Dr Sandell and Dr Breslin, though, focused their attentions on just one of these receptor genes, called hTAS2R38. The protein derived from this gene is known, from laboratory experiments, to be sensitive to a substance called phenylthiocarbamide (PTC). This compound contains a molecular group called thiourea. And thiourea-containing substances are known from other studies to inhibit the function of the thyroid gland. Cruciferous vegetables, such as watercress, turnips and—most pertinently—broccoli, are rich in a group of thiourea-containing compounds called glucosinolates. Dr Sandell and Dr Breslin wondered if there might be a connection.
And, since different versions of hTAS2R38 code for proteins that have different levels of reaction to PTC, they wondered if that might be reflected in the fact that some people like broccoli, and others do not. The two researchers assembled a group of volunteers and checked which versions of the hTAS2R38 gene they had. They then fed the volunteers vegetables and recorded their reactions. All of the vegetables were thought by at least some people to be bitter, but not all of them were cruciferous plants. The non-cruciferous ones were plants which, so far as is known, do not contain glucosinolates. The results were clear. All volunteers found the non-cruciferous vegetables equally bitter, but their reactions to the cruciferous ones depended on their genes. Those with two copies of the version of hTAS2R38 coding for the protein that binds best to PTC (one copy having been inherited from each parent) thought broccoli and its cousins the most bitter. Those who had two copies of the poorly binding version thought they tasted fine. Those with one of each had an intermediate reaction. Despite broccoli's bad reputation, the most bitter vegetables, according to this research, are swedes and turnips. That accords well with work which has shown that eating these vegetables suppresses the uptake of iodine into the thyroid gland. Iodine is an essential ingredient of thyroxine, a hormone produced by that gland. The upshot of all this is that the complaints of children (and, indeed, of many adults) that green vegetables are horrid contain a lot of truth. There is no doubt that such vegetables are good for you. But they are not unequivocally good. As is often observed in other contexts, there is no free lunch.

Running rings round storms Trees keep records of passing hurricanes STUDYING the past is a good way to understand the present, and may even illuminate the future. But the past does not give up its secrets easily.
Hurricane scientists, for instance, would like to know about long-term changes in the frequency and strengths of the storms they study. That would help to show whether the shifting pattern of hurricanes seen in the past few decades is cyclical, random or part of a trend that might be caused by global warming. Unfortunately, meteorologists have been keeping systematic tabs on the relevant data for only about 60 years. Before that, records are sporadic and anecdotal—and that is not enough to see the bigger picture. Human records, however, are not the only sort available. Trees are popular with scientists who want to look at what happened a few hundred years ago, because their annual growth rings mean that their wood can be dated accurately. And Dana Miller, of the University of Tennessee, and her team have used that insight to search for hurricanes that humanity has failed to record. Their results, just published in the Proceedings of the National Academy of Sciences, have identified a number of previously unknown storms that hit the south-east coast of North America. The trick they used to do this was to look at the isotopic composition of the oxygen in the wood of local trees. Water contains two isotopes of oxygen, one of which has two more neutrons than the other, making it heavier. When a hurricane forms, it tends, initially, to rain water molecules containing the heavier isotope. At that point it is still over the sea.

Conversely, the rain that falls from an old storm has more light oxygen in it—and that is the sort of rain that tends to fall on land. When this rain enters the soil, some of it is taken up by trees and incorporated into their wood. So, by measuring the ratio of the two isotopes in the rings of trees, and matching the result to the age of the ring, a history of hurricanes spanning the life of the tree can be reconstructed. Dr Miller's trees of choice were longleaf pines on the Valdosta State University campus in Georgia. Some were still alive, and some were in the form of stumps left from logging that took place on the site at the beginning of the 20th century. To test her method, Dr Miller looked first at the period from 1940 to 1990 (America began making accurate hurricane records in 1940). She was able to identify all 18 years when storms had affected the university campus in that time, though the method also suggested a storm in one year (1943) when there had been none. Going back further into history, she looked at 1855-1939. Here the records for landfalls are still good, although storms were not tracked accurately before they landed. Again, the method identified all known years with storms (21 in total), and indicated one other year when a hurricane had passed. That may or may not have been a false positive. Going back further still, to 1770-1854, the records are much more patchy. In this case the trees suggested hurricanes had passed by in 25 years, whereas only ten stormy years had been recorded. Clearly, the method is not perfect, as the false positive in 1943 shows. But over a period of centuries such errors will come out in the statistical wash. And, by extending the method to other sites (Dr Miller now has data from South Carolina and Florida, as well as Georgia), it should be possible to look for long-term patterns in the frequency of hurricanes and to answer questions that are of interest to insurance companies and environmentalists alike.
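The detection logic amounts to flagging any ring whose oxygen-isotope ratio is unusually depleted in the heavy isotope. A toy sketch, in which the ratio values, the threshold and the example years are all invented for illustration (the article gives no numbers of this kind):

```python
# Toy hurricane detector: flag a year when the ring's (hypothetical)
# light/heavy oxygen-isotope ratio falls below a cutoff, since hurricane
# rain over land is depleted in the heavy isotope. All values invented.

THRESHOLD = 0.95  # hypothetical cutoff, not from the study

ring_ratios = {1940: 1.01, 1941: 0.99, 1943: 0.93, 1944: 0.90, 1945: 1.02}

def hurricane_years(ratios: dict[int, float],
                    threshold: float = THRESHOLD) -> list[int]:
    """Years whose ring chemistry looks hurricane-like."""
    return sorted(year for year, r in ratios.items() if r < threshold)

print(hurricane_years(ring_ratios))
```

A year such as 1943 in the study shows the weakness of any threshold rule: a ratio can dip below the cutoff for reasons other than a hurricane, producing a false positive.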
Still aloft, just Maglev trains lose some of their magic SCHOOLBOY dreams of everyday travel by magnetic levitation (maglev) have been shaken but not destroyed after an accident on a test track at Lathen, Germany. On September 22nd a maglev train ploughed into a maintenance vehicle at 200kph (125mph), killing 23 passengers. Magnetic attraction and repulsion allow maglev trains to travel along guideways at speeds of up to 500kph. Transrapid, a consortium of Siemens and ThyssenKrupp, has been testing maglev trains at Lathen since the mid-1980s. Central Japan Railway (JR Central) has another test track west of Tokyo, which uses more cumbersome technology, relying on super-cooling. This week JR Central said it would spend ¥355 billion ($3 billion) on lengthening and renewing the track, part of it underground, to experiment with longer and faster trains. In both cases the technology is expensive, so maglev trains need public backing. Only one, which runs the 30km between Shanghai and Pudong Airport in China, is operating commercially. It was built by Transrapid and helped along by a large subsidy from the German government. Other projects are in the pipeline, but none has got beyond a feasibility study. China plans to extend the Shanghai line for another 160km to the city of Hangzhou. But Transrapid has not yet secured the contract, which may depend on an agreement to share the technology. In Germany, Transrapid and Deutsche Bahn would like to build a 37km link between Munich's airport and the central railway station, cutting the average journey time from 40 minutes to ten. The federal and Bavarian governments are in favour, but the local mayor is not: he prefers a conventional railway which would serve non-airport traffic too. Even if he loses the argument, there is no guarantee the government will find the money for the Transrapid line—whose role would be partly to show off the system to potential buyers.
Longer-distance projects in Germany were considered and then rejected, mainly because the country is well served by high-speed trains between big cities. Hamburg to Berlin, which might have been the exception, does not have enough demand from passengers. It may be that maglev trains stand a better chance in countries where existing railways are less efficient, such as America and Britain. But finding public money is a problem for even the most serious schemes: the California Regional Maglev Project, the Baltimore-Washington Maglev Project and the Pennsylvania High-Speed Maglev Project. Another American scheme, which may belong more in the realm of science fiction, is SkyTran, proposed by UniModal, of Montana. It involves two-seater maglev bubbles scuttling around a three-dimensional urban network or between cities at up to 240kph. Alternatively, of course, they could remain a white elephant. The crash has prompted a closer look at safety, even though the cause was probably human error. In theory, because of the way they are powered, maglev trains can never collide. Whatever the outcome, schoolboys will continue to dream of trains that never touch the ground.
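The Munich figures are simple speed arithmetic: 37km in ten minutes versus 40 minutes. The average speeds below are assumptions chosen to reproduce the quoted journey times, not numbers from the article:

```python
# Journey time = distance / average speed. The speeds are back-solved
# assumptions that reproduce the article's 10- and 40-minute figures
# for the 37km Munich airport link.

def journey_minutes(distance_km: float, avg_speed_kph: float) -> float:
    """Door-to-door time for a run at a given average speed."""
    return distance_km / avg_speed_kph * 60.0

maglev = journey_minutes(37.0, 222.0)        # ~222kph average
conventional = journey_minutes(37.0, 55.5)   # ~55.5kph average, with stops

print(f"maglev:       {maglev:.0f} min")
print(f"conventional: {conventional:.0f} min")
```

Note that the implied maglev average is well below the 500kph top speed: on a 37km hop, acceleration and braking dominate, which is one reason short maglev links struggle to justify their cost.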

Curve ball A link between unemployment and inflation is fashionable again IF HAIRCUTS and dress styles can come back into fashion, then so can economic theories. That is why policymakers have recently been debating the implications of the shape of that very 1960s concept, the Phillips curve. The Phillips curve was named after A.W. Phillips, whose research suggested a trade-off between British unemployment and wage inflation over the period 1861 to 1957. The curve was widely used in the 1960s. Economists developed models that gave politicians a neat way to find the right balance between the two factors; for every percentage-point fall in unemployment, inflation would rise by, say, half a point. But in the 1970s, the trade-off between unemployment and inflation seemed to evaporate; both rose at the same time, a phenomenon known as stagflation. As Stephen King, chief economist at HSBC, says, “The Phillips curve relationship did exist as long as governments weren't perceived to be exploiting it. When they did, they ended up with accelerating inflation.” Even before the curve began to break down in practice, Milton Friedman had cast doubt on the theory, as had Edmund Phelps, another American economist. The doubters argued that workers would demand higher wages to protect themselves against the rise in inflation. Thus the only effect of government stimulus would be to increase inflation for the same level of employment; in the long run, the Phillips curve became vertical. So the Phillips curve fell out of favour and was replaced by its corollary, the NAIRU, or non-accelerating inflation rate of unemployment (in effect, the natural rate). Economists spent much of the 1980s and 1990s debating what the rate might be. In the late 1990s indeed, many forecast that the Federal Reserve would be forced to raise interest rates to counter inflationary pressures when unemployment fell below 6% (and then 5%). 
But the Fed decided that productivity improvements had driven down the NAIRU and so left policy on hold. Growth duly flourished without causing inflation at the consumer level—although some argue that the laxity of monetary policy caused the tech bubble. Why has the Phillips curve, displaced by the NAIRU and the output gap (which suggests that inflation will rise when economic growth is above trend), come back into the economic debate after so long in the cold? In part because, while the NAIRU and the output gap are nice ideas, it is often hard to agree, at any given moment, on the value of either number. But the main reason is that the relationship between unemployment and inflation has settled down again. Low unemployment has not been accompanied by significant increases in inflation; in other words, the Phillips curve has flattened considerably.
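The story the article tells can be compressed into the standard expectations-augmented Phillips curve: inflation equals expected inflation minus a slope times the gap between unemployment and the NAIRU. A minimal sketch with illustrative numbers (only the "half a point per point" slope echoes the article):

```python
# Expectations-augmented Phillips curve:
#   inflation = expected inflation - slope * (unemployment - NAIRU)
# A flatter slope means low unemployment barely lifts inflation, which is
# the article's closing observation. All figures are illustrative.

def phillips_inflation(unemployment: float, nairu: float,
                       expected_inflation: float, slope: float) -> float:
    return expected_inflation - slope * (unemployment - nairu)

# 1960s-style curve: half a point of inflation per point of unemployment.
steep = phillips_inflation(4.0, nairu=5.0, expected_inflation=2.0, slope=0.5)

# Flattened modern curve: the same unemployment gap adds far less inflation.
flat = phillips_inflation(4.0, nairu=5.0, expected_inflation=2.0, slope=0.1)

print(f"steep curve:  {steep:.1f}% inflation")
print(f"flat curve:   {flat:.1f}% inflation")
```

The Friedman-Phelps critique also drops out of the formula: if exploiting the trade-off raises expected inflation one-for-one, the unemployment gap closes again and only the inflation term is left, a vertical long-run curve.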
