“THE chairman of your [board's] compensation committee should be richer than you and older than you,” one of America's most admired bosses advised a private gathering of 50 chief executives in New York last November. “That way, he won't get jealous when you make your fortune. In fact, he should be someone who loves to see other people get rich. Under no circumstances should he be from the public sector, or a professor.” Another boss provoked groans when he confessed: “I once made the mistake of giving the job to a distinguished academic.”
“Greed, for lack of a better word, is good. Greed is right. Greed works.” This credo by Michael Douglas, as Gordon Gekko in the 1987 film “Wall Street”, seemed to capture the spirit of the decade, with its sharp-suited investment bankers using mountains of debt to buy up sleepy old companies, fire most of the workers and make themselves a fortune. But compared with the past ten years, the greed of the 1980s was as nothing. And whereas the 1980s story was all about greedy Wall Streeters battling against company bosses who wanted to preserve their firm and its traditional values, in the 1990s a shared greed nurtured a symbiotic relationship between Wall Street and company bosses that made rich men (and, increasingly, women) of them all.
The case for greed was perhaps best made over 200 years ago by Adam Smith, who argued that the invisible hand of market forces would ensure that the efforts of individuals acting in pursuit of their own self-interest made society as a whole better off. In other words, judge capitalism not by the motives of the capitalists but by its fruit. Until recently, the fruit of the 1990s double act of investment bankers and company bosses looked both tasty and abundant, especially in America, where greed was given the freest rein. The economy grew more rapidly, productivity increased faster and the jobless rate fell further than anybody had thought possible. Profits soared, as did the stockmarket, spreading wealth to investors of all kinds, from fat-cat managers with share options to ordinary workers with stakes in retirement funds. It all seemed ample vindication for those real-world 1980s Gekkos (Ivan Boesky, Michael Milken, Henry Kravis et al) who argued that the way to ensure that corporate America created wealth for shareholders was to give management a piece of the action.
Doubts started to creep in first with the popping of the dotcom bubble, then with the broader drift in share prices and the economic downturn. In America, the ratio of households' net worth to income has fallen back to 5.3, down from its 1999 peak of 6.3, though still well above its long-term norm of 4. The optimists ascribe this simply to the ups and downs of the business cycle, and there is some truth in that. Yet for all the virtues of America's style of capitalism, many of the recent problems were the natural result of bad incentives. If the current slowdown changes those incentives, it will achieve something useful.
The wrong carrots
Managers' share options were supposed to solve the “agency problem” at the heart of the modern shareholder-owned company. The trouble with having owners who are not managers, and managers who are not owners, is that the managers, as agents of the owners, may not run the firm in the best interests of the shareholders. Handing the managers share options gives them a powerful incentive to put the interests of shareholders first. In the 1990s, when this idea gained widespread acceptance, options spread rapidly through corporate America, and, less rapidly, in other rich countries too. What the theory did not allow for was that share prices could deviate substantially from their fundamental value, and that management could help this process along in the short term. The short term might be long enough for them to exercise their share options and sell the shares before the market caught on.
Options also happened to encourage behaviour that was good for Wall Street. In the 1980s, managers had often put up fierce resistance to their firm being bought, not least because they might well lose their jobs. But share options changed their incentives: because the options vest the moment a firm changes hands, they can make a takeover positively welcome to the managers. That suited the investment banks, which are constantly encouraging mergers and acquisitions because of the huge fees they generate—notwithstanding the lamentable economic record of most mergers. The managers at the firm that does the buying do not benefit from vesting options, but they are routinely offered another carrot: a huge bonus for pulling off the deal.
It is above all in America that company boards fail conspicuously to ensure that managers really serve the long-term interests of shareholders. This is not because board members cannot be bothered to do their job. Most big companies today work their board hard. Even so, board members rarely challenge the chief executive. If they do, they are often asked to resign, and usually oblige. As Enron showed, board loyalty may be encouraged with all sorts of incentives, including donations to favourite charities or consulting contracts. But even without such sweeteners, boards seem to have a natural inclination to turn into clubs, and nobody wants to upset the club president.
Divide and rule
Outside America, things are done somewhat differently. British boardrooms, for example, usually have a chairman, typically a non-executive, to balance the influence of the chief executive and run the board meetings. So the opinions being voiced can be more diverse, and the chief executive does not always get his way.
One idea for making boards more responsible is to hold them properly to account when things go wrong. Generally, board members face no financial penalties if they mess up because the company buys insurance for directors and managers. After Enron's collapse, Paul O'Neill, America's treasury secretary, floated the idea of asking chief executives to sign a financial-health statement that would make them liable for misrepresentations, whether deliberate or not. But the likely effect would be to make it impossible to get anybody to serve on a board. Already, the increased demands of board membership are discouraging chief executives from becoming non-executives elsewhere, says John Whitehead, a former boss of Goldman Sachs and a member of numerous boards over the years. “The risk is you will just get dignitaries who could use the $40,000 fee, college principals, public figures and the like. Boards may look socially responsible, but they won't act as a policeman.”
The big challenge is to ensure that two board committees—compensation and audit—do their job properly. In Britain, the Financial Services Authority has issued tough guidelines for the chairmen of audit committees of financial firms that might usefully be extended to other companies. But unless the chairmen of these committees are full-time, are able to hire their own professional advisers and, ideally, are nominated directly by shareholders, they are unlikely to have the knowledge and independence to be effective watchdogs, reckons Bob Monks, a veteran shareholder activist. He is not hopeful. Failing that, the best way of getting boards to work effectively is for chief executives to encourage robust debate and a culture of accountability. Alas, it is a rare boss who has so enlightened a sense of self-interest.
If a compensation committee were working as it should, what would it do? For a start, it would reward only genuinely superior performance. If a firm's share price goes up for extraneous reasons—a fall in interest rates, say, or a rise in the stockmarket—why should the managers benefit? Rewards linked to a company's share price should probably be triggered only if the firm outperforms the market as a whole, or an industry peer group. And share options should not, as a rule, be repriced at lower levels if the firm's share price falls.
It may not be necessary to stop using share options (though actual shares are probably a purer incentive and have shown themselves to be effective in motivating managers). However, they do provide an incentive to boost the share price in the short run, which may not be in the company's best long-term interest. One way to remove that incentive is to prevent the manager from selling the shares until some time after he has left the company, say three years. That is a long enough period for any trickery done on his watch to come to light. This need not do much damage to the manager's finances; a bank would be happy to extend a loan secured against the locked-up shares, provided it did not think their value had been artificially puffed up.
The biggest problem is to persuade the members of the compensation committee to care at least as much about rewarding the company's owners as they do about rewarding the chief executive who appointed them. The best answer may be to let the owners themselves vote on managers' compensation, especially options. Such a scheme is now being introduced in Britain. Harvey Pitt, the chairman of America's Securities and Exchange Commission, has proposed similar measures in America.
Before Enron's collapse, nobody much cared about audit committees or auditors. Now both are under fire. Strikingly, audit committees' most common response to growing scrutiny is to cover their backs. Many audit-committee reports this year have come with disclaimers saying that the accuracy of the firm's accounts is not their responsibility.
If anybody is going to take responsibility for a firm's accounts, it should be the external auditor. Following Andersen's humiliation at Enron, this duty is now being taken much more seriously. Yet serious conflicts of interest remain for audit firms that continue to do consulting work for audit clients. Andersen, notoriously, earned more from providing Enron with non-audit services than from the audit.
Given the crucial importance of the audit, everything possible ought to be done to eradicate any conflict of interest that might reduce effectiveness. Non-audit work for audit clients should clearly be prohibited. It would also be wise to introduce mandatory rotation of auditors after, say, five years, to stop auditor and client becoming too cosy.
Every crash has its villains, and this time public enemy number one is the Wall Street research analyst. Supposedly, analysts are another force for good corporate governance, putting pressure on management by providing investors with independent analysis of a firm's accounts and prospects. In practice, it seems, they often simply touted shares on behalf of the investment bank that employed them. This was particularly true of shares sold in IPOs. Investment banks earned huge sums of money from underwriting IPOs, and from other business relationships with companies. They typically earned little or nothing from selling research. No wonder the researchers often bowed to the investment bankers' demand for a buy recommendation to keep client firms happy.
According to the Boston Consulting Group, the potential for such tainted research was greatest in the technology, telecoms and financial-services industries, which contributed the lion's share of investment-banking revenues. As chart 6 shows, firms in these sectors had the largest number of analysts carrying out “research” into them.
Wall Street is worried that Congress will impose new regulations along Glass-Steagall lines to stop underwriting firms selling research. Erecting a new legal barrier of this kind might be a mistake, not least because to some extent this problem is curing itself. The IPO business is comatose and shows no sign of returning to the level of activity seen in the late 1990s. Investment banks are writing all sorts of new rules supposed to ensure the independence of their research, or at least give that impression. Examples include bans on analysts trading in the shares of companies they cover, disclosure of any investment-banking relationships with a company, and even making the occasional “sell” recommendation. Morgan Stanley has abandoned its system of buy and (rarely) sell recommendations for a set of ratings that offer only relative, not absolute advice.
Prudential Securities got out of investment banking altogether to prove its research is not biased. This is a brave move, because independent research firms have so far struggled to persuade anybody to pay for their work. Perhaps nobody really believes that having good research will help them to make money in the stockmarket.
In a speech earlier this year, Peter Fisher, an under-secretary at America's Treasury, urged insurance companies and other institutional investors to get more involved in overseeing the management of the companies they invest in. Enron had highlighted the potential cost of neglecting to do so. “Corporate governance should be your risk-management programme for the next ten years,” he said. But will they take any notice?
So far, institutional investors in America, whose combined holdings are too large for any management to ignore, have been shockingly indifferent to bad management. If they did not like what they saw in a firm, many simply took the old Wall Street walk and sold their shares. Even index funds, which did not have the option of selling, mostly did nothing to call underperforming firms to order. There were a few honourable exceptions, mainly public-sector pension funds—though even the most active of them all, CalPERS, failed to spot trouble coming at Enron; worse, it invested in one of Enron's notorious off-balance-sheet partnerships.
Why is everybody being so discreet? Many of the biggest fund-management companies are hoping to win investment mandates from corporate pension funds and 401(k) plans, so they do not want a reputation for being troublemakers. Smaller funds may think they do not carry enough weight to make a difference, and that their time would be better spent on other things. Some may feel they lack the expertise to become involved in such complex matters. Robert Litan of the Brookings Institution, a think-tank, reckons there might be a market opportunity for a new firm that advises institutional investors on corporate-governance matters, ideally involving well-known public figures with solid reputations. Instead of trying to save Andersen from bankruptcy, perhaps Paul Volcker, a former Fed chairman, would have made better use of his energies by starting such a business.
Public pension funds started to take a greater interest in corporate governance in the mid-1980s after the government had told them that it was their legal duty to vote their proxies. The SEC recently issued a letter instructing mutual-fund companies that they also have a duty to vote proxies, which may trigger more activity from that quarter.
John Bogle, the former boss of Vanguard, the world's biggest manager of stockmarket index funds, recently proposed the launch of a federation of long-term investors, to cover index funds and other institutional fund managers which rarely sell shares. Between them, just six such firms hold some $1.4 trillion-worth of shares, around 10% of all shares outstanding. Such a federation would promote better corporate governance in order to boost long-term share values, says Mr Bogle.
A recent study of the relationship between corporate governance and equity prices in 1,500 firms in the 1990s found that better governance was correlated with higher returns. A strategy of buying shares in companies with good governance and selling the rest would have produced well-above-average results.
In the past, American capitalism has shown a remarkable ability to learn from its mistakes and emerge from them even stronger. The 1929 crash prompted the passing of tough investor-protection laws that greatly improved the quality of the financial markets. After America's savings-and-loan crisis and related property debacle of the late 1980s and early 1990s, the banking system was recapitalised and its risk management much improved. Perhaps now it is the turn of American shareholders to revitalise capitalism, by ensuring that the greed of their managers works with them, not against them.
This article appeared in the Special report section of the print edition
2 July 2004
John F Schumaker takes on the philosophers of greed.
CHINESE philosopher Lao Tzu wrote 2,500 years ago: ‘There is no calamity greater than lavish desires, no greater guilt than discontentment and no greater disaster than greed.' If he's right, we've concocted a mighty sick world for ourselves. The infamous ‘greedy Eighties' turned out to be a mere dress rehearsal for one of the most spectacular greed surges in history, with jaw-dropping degrees of stockmarket folly, corporate skullduggery, decadence, excess and high-octane narcissism. But, just as with the ‘lessons of the Eighties', the ‘lessons of the late Nineties' fall on deaf ears. The overriding lesson seems to be that greed is sweet for the economy.
As human beings continue to be reshaped by consumer culture into restless, dissatisfied, and all-desiring economic pawns, greed is being redefined as a virtue and a legitimate guiding principle for economic prosperity and general happiness. In the process, it is steadily eating away at the cornerstones of civilized society and undermining the visions, values and collective aspirations that made us strong.
However, in his essay ‘The Virtue of Greed', Walter Williams, an economics professor at George Mason University, maintains that without greed, our current economic and social structures would implode. He echoes the view of many economists in saying ‘greed produces preferable economic outcomes most times and under most conditions'. Many economic rationalists agree that greed's proven superiority as the psychological launchpad for economic activity is due to its being the only consistent human motivation. Most alternatives have revolved around altruism, and failed. Even the respected economist Lester Thurow, in an essay entitled ‘Market Crash Born of Greed', holds that ‘altruism does not seem to be congruent with the way human beings are constructed. No one has been able to construct a society where communal altruism dominates individual greed.'
When we salute all-consuming America as the standout ‘growth engine' of the world, we are in many ways paying tribute to the economic wonders of greed. William Dodson's essay ‘A Culture of Greed' chronicles America's pre-eminence as a greed economy. He writes that the US enjoys a relative absence of tax and labour constraints that would otherwise burden corporations with a sense of social responsibility, along with various systemic advantages and historical traditions that together allow greed to flourish and be milked for purposes of profit and growth.
Jay Phelan, an economist, biologist, and co-author of Mean Genes, feels that greed could be our ultimate undoing as a species. Yet he theorizes that evolution programmed us to be greedy since greed locks us into discontent, which in turn keeps us motivated and itchy for change. In the past at least, this favoured survival. Conversely, he believes, it would be disastrous if humans lacked greed to the extent that they could achieve a genuine state of happiness or contentment. In Phelan's view, this is because happy people tend not to do much, or crave much – poison for a modern consumer economy.
Recent years have seen the publication of a wide range of studies casting doubt on whether economic models aimed at increasing personal wealth and consumption are actually conducive to human happiness. In fact, the large-scale General Social Survey of the United States found that, from the early 1970s to the late 1990s, the percentage of people who are ‘very happy' actually dropped from 34 per cent to 30 per cent, despite higher incomes, more possessions and improved living standards.
Such findings are being hailed by social critics as proof that the greed economy is toxic to well-being, and that it is hastening our slide into a collective state of ‘unhappy consciousness', as sociologists call it. But they may be missing the main point if, indeed, greed and unhappiness are the fire in the belly of a consumer economy. There is little doubt that the cultural sanctification of greed is creating a deep existential void that cannot be filled – whatever the degree of material indulgence, personal achievement or private gratification. Despite that, this ‘Empty Self' of modern life, with its insatiability and alienation, may actually be what is necessary to power greed economics.
The eminent sociologist Zygmunt Bauman writes in an insightful essay, ‘The Self in a Consumer Society', that greed itself is changing in order to better serve consumer capitalism. In the past, says Bauman, greed was not constant because people's desires were still attached to needs and objects, as well as a credible social world, which meant they tended to pause from time to time in satisfaction or reflection. Over time, however, consumer culture has upped ‘consumptive capacity' by honing its members to be immune to satisfaction, and thus immediately ready to desire the next thing that comes along. Of this, Bauman says that desire no longer desires satisfaction. In the modern age, ‘desire desires desire', which is the basis for our new ‘constant greed'.
Research is starting to show that we have come to see ourselves as incorrigibly greedy by nature. According to one survey, nearly 90 per cent of people agree with the statement ‘Humans always want more, it is part of human nature'. But in truth, a society's culture determines the extent to which our propensity for greed is activated or suppressed.
Judith Ann Johnson's groundbreaking 1999 doctoral dissertation drew the connections between maximal greed and the cultural combination of capitalism, materialism, hyper-competition and discrimination. It is the presence of all these factors that makes greed what she calls an overarching ‘map of Western consciousness'.
Another of Johnson's key findings is that greed operates best at very low levels of wisdom, awareness and understanding. It may be that the relentless dumbing down of consumer society is a valuable cultural strategy that paves the way to ever more efficient greed economics.
One specific way that greed sparks the modern economy is by suppressing savings rates via unending craving for all things consumable, which translates into frivolous spending and a hearty appetite for credit. There is an economic formula, made famous by financial legend and greed guru Leon Levy, that states: ‘For every 1 per cent rise in savings, corporate profits fall by 11 per cent.' This means, for example, if greed-inspired overspending in the US would ease to the extent that savings rose to a modest 5 per cent from the current subzero mark, corporate profits would fall by 50 per cent or more.
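Taken at face value, Levy's rule of thumb is simple linear arithmetic. The sketch below is purely illustrative, assuming the quoted 11-to-1 sensitivity holds linearly over the range in question; the function name and the linearity assumption are mine, not Levy's:

```python
def implied_profit_change(savings_rise_points, sensitivity=11.0):
    """Percentage change in corporate profits implied by Levy's
    rule of thumb: each 1-point rise in the savings rate cuts
    profits by roughly 11 per cent (assumed linear)."""
    return -sensitivity * savings_rise_points

# Savings rate moving from roughly zero to a modest 5 per cent:
print(implied_profit_change(5))  # -55.0, i.e. profits fall by half or more
```

A five-point rise thus implies a fall of about 55 per cent, matching the article's ‘50 per cent or more'.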
Greed is the backbone of the prevailing ‘philosophy of more' that supports the profitable ‘big is beautiful' trend (as with ‘mini-mansions', four-wheel drive SUVs, and so on) as well as the worldwide ‘investment-driven' property boom/bubble. The gluttonous aspect of greed-mindedness carries further short-term advantages by way of increased tendencies toward overconsumption, waste, premature disposal and replacement, needless upgrading and general disregard for conservation.
Greed drives entrepreneurial investment. It also facilitates the manufacture and commercial exploitation of false needs. It is no wonder that greed enthusiasts insist that nothing can beat greed when it comes to the economy, and that we should not give up on it as the epicentre of economic and social life, or fixate on burst stock-market bubbles, or sticky-fingered Enrons and Worldcoms.
Peter Catsimpiris, co-founder of the pro-capitalist Laissez Faire League, even scolds us in his essay ‘In Defense of Greed' for stunting our children's greed potential with commands such as ‘Give some to the other children'. To unleash the power of greed, he says, we should teach them that greed is the great hope of humanity from which can spring boundless prosperity, progress and innovation.
But others point out how greed, and its ‘dying with the most toys' cultural hero system, is infusing children around the globe with self-destructive degrees of materialism, avarice and self-preoccupation. The commercialization of childhood is being led by greedy corporations that put profits before social responsibility and children's health. Over the past two decades, for example, aggressive advertising by the soft-drink industry has seen high-sugar soft-drink consumption double in children aged 6 to 11, a major contributor to the worsening epidemic of childhood obesity and diabetes. Today, with greed still their main moral compass, these companies market an ever-expanding array of caffeinated drinks to children that have health experts worrying about a new wave of youth addiction.
The globalization of greed is being facilitated by agencies such as the World Trade Organization, whose mission it is to eliminate obstacles to the proliferation of transnational corporate activity, but which in effect merely pumps up corporate profit sheets at the expense of workers' rights, local environments and communities.
The notion that individual greed can serve the common good has wormed its way into political philosophies, even some with long-standing socialistic leanings. The ultimate expression of this illogic can be seen in the current US administration of George W Bush, which pins most of its hopes on a government-by-greed strategy. But, as antigreed psychologist Julian Edney argues, there is a fundamental flaw in this method as evidenced most conspicuously in the ever-widening gap between rich and poor: ‘Greed demolishes equity. Simply, you cannot have both unrestrained greed and equality.'
According to Edney, the celebration of greed has spawned a ‘schizophrenic haze' that numbs society to the tragic and dangerous consequences of the present ‘apartheid economy'. In the end, unchecked greed erodes freedom, undermines the social fabric and is an undemocratic force.
More and more mental-health professionals are saying that greed is not nearly as good for people as it is for economies, with some warning that greed is beginning to overwhelm conscience, reason, compassion, love, family bonds and community. Moreover, existing levels of constant greed are causing clinical depression and despair in many people.
The term ‘pleonexia' is being used to diagnose pathological greed that can contribute to a host of ills, including stress, burnout, gambling addictions, compulsive shopping, ‘affluenza' and loss of moral grounding.
American psychologist and greed treatment specialist David Farrugia sees greed as a mistaken, empty and shortsighted goal that contains many seeds of destruction, in particular those that destroy families and marriages. Beyond that, in his article ‘Selfishness, Greed, and Counseling', he reports that a chronic orientation toward greed results in inflexibility, anxiety and diminished reality testing, all of which tarnish a person's overall experience of life.
Extremes of greed may even make a greed economy sick. For instance, Leon Levy feels that the greed factor in the US has actually gone too far in subduing savings and raising debt, and that consumption and the economy generally will be seriously hampered for some time to come.
Unchecked greed can also be so harmful to the environment that it comes back to haunt the economy. In fact, the single largest hitch with greed culture and greed economics is the long-term crushing effects these have on the planet. That is a monumental problem that none of today's greed enthusiasts have been able to solve.
John F Schumaker is an American-born clinical psychologist now living in Christchurch, New Zealand. His latest book is The Age of Insanity.
This article is from the July 2004 issue of New Internationalist.