
Good Reading -- September 2013

Quoted

"I believe that the root cause of every financial crisis, the root cause, is flawed government policies." Hank Paulson, August 2013

Facts & Figures

  • Fewer than 1/3 of the companies in the Nasdaq 100 in January 2001 remain in the index today. (Source: Goldman Sachs)

  • The value of insured coastal exposure in 2012 was $1.175 trillion, up by a third since 2007. (Source: Bob Hartwig of III, in an interesting article about Hurricane Ike and hurricane risk).

Attachments

  • "Measuring the Moat" -- The latest from Michael Mauboussin. "Assessing the Magnitude and Sustainability of Value Creation."

Links

  • "The Fraud Detective" -- A great profile of Jim Chanos. Highly recommended.

  • A recent interview on China and other topics.

  • "The Amazing, Surprising, Africa-driven Demographic Future of Earth, in 9 Charts" -- The interactive charts in this article are interesting and pretty suprising.

  • "1940-2010: How Has America Changed?" -- Brief but interesting look at the 1940 census compared to the one 70 years later. Thanks to Vitaliy for passing it along.

  • "Paul Singer Hates Benchmarking" -- The title says it all, but there are also some interesting thoughts on recent performance and the market.

  • "Jeff Bezos...known for demanding management style at Amazon" -- Good profile of Amazon's CEO and the new owner of the Washington Post. Also, an interesting look at early days Amazon and Bezos.

  • Michael Price Speaking at London Value Investor Conference -- I'm several months late (this was from May), but it is excellent. Highly recommended.

  • "40 Maps That Explain the World" -- If you like maps there are some interesting ones in this article.

  • "America's Next Decade" -- More fun with maps, this time via Forbes and the author's accompanying blog post.

Articles

  • "The Long, Sorry Tale of Pension Promises" -- Roger Lowenstein with the sad truth about pensions.

  • "The Coming Green Tech Mania -- And Why It's a Good Thing" -- A brief history of financial bubbles, with some (questionable) theories as to their origins.

  • "Mutually Insured Destruction" -- An interesting look at climate change and catastrophe risk in insurance pricing.

  • "The Great Stagnation of American Education" -- Professor Bob Gordon, who last year published a notable and controversial paper declaring a new era of permanently lower growth due to less technological innovation, explains why education may be an equally big problem.

The Long, Sorry Tale of Pension Promises

How did states and cities get into this mess? It's a simple case of human frailty; where to go from here

By ROGER LOWENSTEIN

Fifty years ago, the auto industry suffered a massive pension bust. The numbers back then were small, but pension failures are never about the numbers—they're about human frailty. People are tempted to promise more than they can deliver. Today, cities and states across the country are way behind on the promises they made to their employees. Several—including Detroit—are in bankruptcy.


Back in 1963, Studebaker, an independent auto maker in South Bend, Ind., was struggling to compete with the Big Three. Desperate to stay afloat, the company had increased the benefits it was promising to its retirees four times in the 1950s and early 1960s. What was desperate about this? Pension benefits aren't paid out of thin air; sponsors are supposed to set aside a sum of money proportional to the benefits that will eventually come due. If the money is invested prudently, the fund will have enough assets to meet its obligations.

Here's the rub: While Studebaker was nominally increasing benefits, it hadn't the slightest hope of making the requisite contributions. The "increases" were a fiction, but when you have no cash, promising future benefits is the best you can do, whereas raising salaries is out of the question. The United Auto Workers was complicit in this fiction. Union officials reckoned that it was better to tell the members they had won an "increase" rather than to admit that their employer was going bust.

Studebaker halted U.S. operations at the end of '63, and the company terminated its pension plan. Workers saw the bulk of their pensions go up in smoke. The loss was devastating—$15 million—and Washington didn't offer a bailout. People were shocked, though it isn't clear why. In 1950, when General Motors agreed to a pension plan, a young consultant named Peter Drucker had termed the landmark agreement a "mirage," doubting whether any company could anticipate its finances and the actuarial evolution of its workforce decades hence.

Planning wasn't the problem. Auto makers knew their pensions were underfunded—they simply preferred to spend their cash on sexy tail fins or executive bonuses. The Studebaker failure moved Congress to enact a remedy, although it took its sweet time, finally getting around to approving the Employee Retirement Income Security Act in 1974.

Erisa, as the law was known, required pension sponsors to pay annual premiums for pension insurance. It also mandated that companies actually fund their pension plans. "Mandated" is a term of art—it presumes the power to enforce. Alas, companies that found themselves in trouble tended to fall behind anyway. Over the ensuing four decades, most of the heavily unionized industries—steel, airlines, automobiles—suffered waves of bankruptcies and pension failures (now at least mitigated by insurance). Erisa provided some stability to corporate pensions, but not as much as hoped.

Public pensions—and here we come back to our current straits—replicated this behavior. Cops, firefighters and teachers had pensions well before most private-sector workers, but benefits weren't so high as to cause a problem, since government employers unilaterally set benefit levels (as well as salaries) without resorting to anything as unpleasant as collective bargaining.

By the time of the Studebaker collapse, however, matters were changing. New York City granted its employees the right to collectively bargain in 1958, and pretty soon, the genie was out of the bottle. In the 1960s, New York suffered a wave of public strikes, the resolution of which typically included pension increases. The city fathers reacted just as Studebaker's executives had. By the time Erisa was passed, New York was on the verge of bankruptcy, but Congress didn't think to deal with public plans. Cities and states could do as they pleased.

Private pensions gradually faded as an issue because many employers with pension plans failed, and newer companies (read: Google) never started them. But the problem with cities and states has mushroomed. As of last year, public plans are underfunded by a cool $1 trillion. Illinois is a poster child: $100 billion in the hole. Plans in Connecticut and Kentucky are in bad shape, ditto Chicago, Pittsburgh, the bankrupt San Bernardino, Calif., and many other cities.

The temptation for governments to negotiate unrealistic benefits was even greater than in the private sphere. Elected officials knew that, by the time benefits came due, they would be out of office. Union officials knew it, too. Once benefits were agreed to, cities and states chose to skimp on funding. Politically, it was always preferable to build the extra school or staff the additional fire station than to squirrel away more pension money.

Much has been written about the poor investing performance of public pension plans. But for all the ill-conceived speculation of Calpers (the giant California fund) and others, the real problem is that politicians across the country have failed to fund. For them, the choice between raising taxes and keeping the pension fund solvent is no choice at all.

This is a pity because, when properly run, pensions remain the best form of retirement plan. They do away with many of the risks borne by individuals alone, such as outliving one's savings or retiring at the wrong time. And most people don't have the expertise to manage portfolios.

Of course, if employers don't make adequate contributions, such advantages disappear. What happens then? In the private sector, customers walk and bankruptcy results. That is also what happened in Detroit: The city's taxpayer-customers left town. But Detroit is unusual.

Most communities will not lose half their populations, and most will not seek recourse in bankruptcy. Like it or not, governments will have to find a route to solvency. That will mean reduced benefits, higher taxes or, usually, a combination of the two. In the past few years, a reform movement has begun. Governments have begun to trim benefits—a few, such as Rhode Island, quite drastically.

What's needed is to impart a sense of urgency—to convince cities and states that pension underfunding has to be dealt with now, like any other fiscal shortfall. Illinois Gov. Pat Quinn has temporarily suspended legislative salaries to pressure lawmakers to enact pension changes—an inspired move.

But if you want governments to come clean, go after their drug of choice—credit. Detroit's bankruptcy has had a salutary effect, pushing up interest rates for other cities with pension problems. If bond markets punish localities for not funding their pension plans, politicians will not be able to look the other way.

The trouble is, the bond market's memory is short. Before we get more Detroits, or more Studebakers, the federal government should enact an Erisa (with teeth) for public employers. More simply, it could announce that local governments that fail to make timely and adequate contributions to their pension plans would lose the right to sell bonds on a tax-free basis. That would get their attention.

The point isn't to punish public retirees. The point is that, when governments make contractual promises, they ought to fund them.

—Mr. Lowenstein is the author of "While America Aged," on the pension crisis.

The Coming Green Tech Mania -- And Why It's a Good Thing

William H. Janeway

Throughout the history of capitalism, economic bubbles have been commonplace. They have emerged wherever liquid financial markets exist. The range challenges the imagination: from the iconic tulip bulb bubble, to gold and silver mining bubbles, to bubbles around the debt of newly established countries of unknowable wealth, to -- again and again -- real estate and stock bubbles.

The central dynamic is always the same: The price of a financial asset becomes detached from the real value of the economic asset it represents. So the price of dotcom shares in 1998–2000 soared out of any relationship with the underlying cash flows -- present or future -- of the startup companies striving to exploit the commercial promise of the Internet. Speculators in the financial asset can profit, even when the project they have financed fails.

Economic bubbles have also been necessary. Occasionally, the object of speculation has been one of those fundamental technological innovations -- canals, railroads, electrification, automobiles, aviation, computers, the Internet -- that eventually transforms the economy. In these cases, the prospects of short-term financial gain from riding a bubble mobilize far more investor capital than prudent professional investors would otherwise dole out. Moreover, the very momentum of the bubble forces those careful investors to join the herd lest their relative underperformance leave them with no funds to invest: Warren Buffett, who successfully steered clear of the great dotcom/telecom bubble of 1998–2000, is the exception that proves the rule.

Economic bubbles, as everyone knows, have also inevitably burst. And the consequences can be grave or transient. When the speculation infects the credit system that fuels the entire economy -- and especially when its object offers no prospect of increased economic productivity -- the consequences of its collapse are felt mostly in the short term and are unequivocally negative, maybe even catastrophic.

But when the damage of the speculation is limited to the market for equity and debt securities, the adverse economic consequences of the bubble’s popping may be muted. Further, when the object of speculation is a transformational technology, a new economy can emerge from the wreckage. That is why, for example, the consequences of the tech bubble in 2001 were radically different from those of the housing bubble in 2008.

BOOM AND BUST

So what can we learn from the history of productive bubbles that could help us anticipate where and how (if not when) the next may emerge? Here, understanding the role of the state is singularly important. Productive bubbles have generally followed investments by the state -- that other source of financial support for projects of uncertain economic value. For example, the bonds that financed the building of the Erie Canal in the early nineteenth century were guaranteed by the state of New York. In the mid-nineteenth century, the federal government subsidized railroad construction through massive grants of public lands. At the start of the twentieth century, the government granted AT&T a monopoly on long-distance telephony in return for universal service, which helped make voice communication ubiquitous. Following World War I, as the Roaring Twenties took off, the U.S. Navy and Herbert Hoover’s Commerce Department sponsored the creation of RCA to exploit all American patents on wireless communications, thereby launching broadcast radio. Further, it was the states that made electrification possible: Their regimes of regional monopolies and price regulations enabled massive investment in expensive infrastructure. This pattern has continued into the present day; since World War II, unprecedented government investment in science has built the platforms on which entrepreneurs and venture capitalists have danced.

After each of these booms of investment, a bust followed. During the 1880s, 75,000 miles of railroad track were laid down in the United States. During the four years following the crash of 1893, more than half of that trackage was in receivership, but no one tore up the rails. Even the crash of 1929 and the ensuing Great Depression did not reverse the electrification of the American economy. And following the bursting of the dotcom/telecom bubble in 2001, the “dark fiber” that was prematurely laid down has come to be fully utilized and then some.

The government’s interventions in the market economy were not based on pure economic calculus. During the nineteenth century, the United States pursued mercantilist policies of protection and subsidies for domestic industry, as have all countries playing catch-up. The overriding mission was economic integration and coast-to-coast development: the canals and turnpikes, railroads and telephone lines were built in the name of America’s “manifest destiny” to expand across the continent.

In the twentieth century, the drive toward national development was followed by the imperative of national security. During World War II, science went to war on an unprecedented scale, yielding innovations from radar to the atomic bomb. And the commitment continued through the decades of the Cold War. From 1950 through 1978, federal government agencies accounted for more than 50 percent of all R&D spending. From silicon to software and the Internet, the entire array of information and communication technologies that we use today originated in government programs aimed at promoting national security.

State agencies not only funded scientific research; they also served as creative and collaborative customers for the products that followed. They pulled the suppliers down the learning curve to low-cost, reliable production. In other words, they rendered new technologies ripe for commercial exploitation.

Washington was not the only national capital to sponsor the computer revolution. In direct contrast with their European counterparts, however, the Defense Department, NASA, and the Atomic Energy Commission did not pick “national champions.” Rather, competition for contracts was open to such emerging players as Texas Instruments and Intel. And government agencies insisted on a transparent intellectual property regime, which created a reservoir of accessible technology that private-sector entrepreneurs could draw on in the following decades.

In the second half of the twentieth century, conquering disease came to complement national security as a motive for state investment. U.S. President Richard Nixon’s metaphor of the “war on cancer” represented more than a play on words; it invoked an open-ended commitment that transcended cost-benefit analysis, one that has underwritten the budgets of the National Institutes of Health for a generation.

By coincidence, just as the computer technologies that the federal government had fostered were maturing around 1980, the first wave of modern biotechnologies also came into public view. And the hugely successful initial public offerings (IPOs) of Apple Computer and Genentech in the autumn of 1980 marked the end of seven years characterized by an utter lack of exuberance in the stock market.

GREEN BUBBLE

Although delayed by Federal Reserve Chairman Paul Volcker’s painful defeat of inflation, the “mini-bubble” in IPOs in 1983 launched the greatest bull market in the history of capitalism, culminating in the dotcom/telecom blowout at the end of the millennium. And along the way, successive IPO windows opened up for biotech startups, continuing even after the bubble burst in 2001.

Now years, even decades, of building out the new digital economy lie before us. And with that, there will be numerous opportunities for speculation-worthy innovation: further extension of the virtual social world; making the mobile and cloud computing environments safe and reliable; moving from speech recognition to natural language understanding; extracting actionable information from big data. Progress in the biosciences has comparable potential.

But what is the next domain in which state investment and speculative mania could combine to deliver another new economy? Its first manifestations have recently lit up the sky in Germany and China. Haphazard movement toward the low-carbon economy of tomorrow is already discernible.

In fact, the first bubbles of the next economy have already been generated: In recent years, the German government has offered generous subsidies to support the rapid expansion of solar panel production, which were aggressively followed by China’s own offerings. The classic pattern of stock-market speculation driving massive increases in supply, followed by price collapse and bankruptcy, has played out in both countries.

Meanwhile, the United States has tied itself into its own complicated knot. Former President George W. Bush’s Energy Policy Act of 2005 authorized a program of loan guarantees “to support innovative clean energy technologies that are typically unable to obtain conventional private financing due to high technology risks.” On taking office in January 2009, during the post–Lehman Brothers implosion of the economy, President Barack Obama substantially expanded this program as part of his stimulus plan. The success to date of one recipient, Tesla, in leveraging sufficient interest in its high-end electric cars to generate a micro-bubble in its stock and repay its loan hardly offset the political cost of writing off loans guaranteed to Solyndra (advanced solar cells) and A123 (novel battery technology) when each company went bankrupt.

Washington faced a conflict between stimulating the economy in the short run and supporting the supply of alternative sources of energy in the long run. Drawing on the history of the Pentagon’s sponsorship of digital technologies, the Department of Energy could have been more open, competitive, and transparent in funding multiple sources of innovative battery technology. But that would have entailed a multi-year program incompatible with the immediate need to put a floor under the collapsing U.S. economy.

At a more general level, there is no doubt that establishing the foundations for the low carbon economy will require direct and indirect support from government on a massive scale. But the ideological and institutional barriers to such commitments are formidable.

On the ideological front, the United States has now distinguished itself among the countries of the world for the number of its political leaders who deny that climate change actually exists. Institutionally, government support for clean tech and green tech is much more difficult to mobilize than was the commitment to computing in the name of national security. An enormous, profitable, and politically entrenched conventional energy industry already exists, in contrast with the nascent information processing industry of the post–World War II era. The Advanced Research Projects Agency-Energy (ARPA-E) is explicitly modeled on the fabled Defense Advanced Research Projects Agency (DARPA). But it commands a trivial amount of resources relative to DARPA’s endowment during the 1960s and 1970s. It also lacks the set of big brothers -- collectively the Department of Defense -- whose role as customers for the new technologies was transformational. The Department of Energy could have done better in sponsoring new battery technologies, but it is not clear where it would have found the source of demand to pull the innovations sufficiently close to the commercial market for the private sector to take over.

Behind the scenes and out of sight, many of the technologies that will ultimately power the next new economy have been sponsored by what the sociologist Fred Block has called the “hidden developmental state.” The most visible and economically significant of the new production technologies, hydraulic fracturing of shale hydrocarbons (fracking), was fostered by various arms of the federal government starting in the 1970s.

It is true that fracking has taken a full generation from research and experimentation to large-scale deployment. Yet on this and other fronts, the seeds of innovation have been sown and watered. When it comes to engendering a financial mania, a wave of speculative investment at the scale necessary to construct the foundations of a new economy, what matters is a plausible story -- not the hard numbers that would satisfy rational agents of the old neoclassical economic theory. From the electric charging stations proliferating in Silicon Valley to the shale fields of North Dakota and West Virginia, by way of the busted solar bubbles of China and Germany, those stories are already beginning to accumulate.

Mutually Insured Destruction

By MAGGIE KOERTH-BAKER

In March 1947, a winter of heavy snowfall followed by a quick thaw and torrents of rain swelled rivers throughout England and Wales. Over the course of just 13 days, at least 27,000 homes and businesses were flooded. It was one of the worst natural disasters in British history. But thanks to climate change, which can prevent the buildup of the thick snowpack from which spring floods draw their strength, that sort of flood may be less likely to happen today.

The seemingly inexorable (and increasingly irreversible) march of planetary warming is something we tend to associate with increased devastation — floods and famine, droughts and storms. In many cases, that’s true. But there’s a reason scientists prefer the term “climate change” to “global warming” — not everything is getting warmer. As the global average temperature rises, it alters weather systems, changing patterns of heat and cold and shifting wind currents. Risk is redistributed along with them.

No one understands risk better than the insurance industry — except, perhaps, the reinsurance industry, the companies that sell insurance to insurers, which also need protection from risk exposure. As the risk managers for the risk managers, reinsurers follow climate change obsessively. A great deal of money is at stake. If the 1947 spring floods happened today, they could cost the insurance industry as much as $24 billion.

In June of this year, the Geneva Association, an insurance research group, released a report called “Warming of the Oceans and Implications for the (Re)insurance Industry.” It laid out evidence explaining how rising ocean temperatures are changing climate patterns and called for a “paradigm shift” in the way the insurance industry calculates risk. Traditionally, insurers have predicted the future by studying the past. If your house is on a 100-year flood plain, for example, that’s because an actuary looked at historical data and calculated that there’s a 1 percent chance of your neighborhood’s experiencing a flood of a certain magnitude every year. Over the course of 100 years, that massive flood is likely to happen about once.
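
To make the actuary's arithmetic concrete, here is a minimal sketch of the 100-year-flood calculation (it assumes each year is independent, which real catastrophe models do not):

```python
# A "100-year flood" has a 1 percent chance of occurring in any given year.
p = 0.01        # annual probability of a flood of that magnitude
years = 100

# Expected number of such floods over a century: 100 * 0.01 = 1.0,
# which is the sense in which it is "likely to happen about once."
expected = p * years

# Probability of at least one such flood in 100 years,
# treating each year as an independent trial:
at_least_one = 1 - (1 - p) ** years     # ~0.634

print(f"Expected floods in {years} years: {expected:.1f}")
print(f"P(at least one flood): {at_least_one:.1%}")
```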

But the past can no longer reliably predict the future. A 2011 paper in The Journal of Hydrology suggests that the risk of spring floods associated with snowmelt in Britain will decline. That same year, a paper published in the journal Nature indicated there may be a link between climate change and an increased risk of fall flooding in Britain.

To fully grasp how our changing climate affects their downside, the insurance and reinsurance industries need new ways of modeling risk — systems that look at what’s happening now rather than what happened decades ago. That drive is leading insurance wonks to join forces with climate scientists, who might have found a solution.

While the ever-practical insurance industry has long focused on the past, climate science has, for the most part, been fixated on the far future. Scientists built computer models of virtual worlds and used them to test hypotheses about what would happen to our children and grandchildren as the planet becomes hotter.

“But for most practical decisions,” says Myles Allen, a climatologist at Oxford University, “what the world will be like in 50 years’ time is less important than understanding what the world is like today.”

A new method of statistical analysis called “event attribution,” developed by Allen, allows climate scientists to better understand how weather patterns work today. It examines recent severe weather events, assessing how much of their probability can be attributed to climate change. These impacts are so complex that isolating them would be like taking the sugar out of a chocolate-chip cookie — nearly impossible, because everything is so intertwined. Event attribution tries to break through this ambiguity using brute force.

Harnessing a tremendous amount of computing power, scientists create two virtual worlds: one where the atmosphere and climate look and operate like ours do today, and one that looks more like the preindustrial world, before we started releasing greenhouse gases from factories, cars and buildings. They alter the weather in both simulated environments and see how natural disasters play out given differing sea-ice levels, greenhouse-gas concentrations and sea-surface temperatures. They do this over and over and over, tens of thousands of times, producing an estimate of how much our altered climate affected the outcome.

It’s a slow process that requires sophisticated software, which is why it’s a relatively recent development. It took Allen and his team six years and 50,000 simulations to analyze the causes behind an episode of fall flooding in Britain in 2000. Eventually, they were able to say this: 9 times out of 10, the world with climate change had a 20 percent greater chance of experiencing those floods than the world without.
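
As a rough illustration of the arithmetic behind such attribution statements, here is a toy sketch (the event probabilities are invented, and real studies use full climate-model ensembles rather than random draws):

```python
import random

random.seed(0)

def event_frequency(p_event, runs=50_000):
    """Fraction of ensemble runs in which the extreme event occurs."""
    return sum(random.random() < p_event for _ in range(runs)) / runs

# Two simulated worlds: one with today's greenhouse forcing, one
# resembling the preindustrial climate. Probabilities are hypothetical.
p_factual = event_frequency(0.012)         # world with climate change
p_counterfactual = event_frequency(0.010)  # preindustrial world

# Risk ratio: how much likelier the event is with climate change.
# A ratio of about 1.2 is a "20 percent greater chance."
print(f"Risk ratio: {p_factual / p_counterfactual:.2f}")

# Fraction of attributable risk: the share of the event's probability
# attributable to climate change.
print(f"Attributable fraction: {1 - p_counterfactual / p_factual:.1%}")
```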

That sort of less-than-satisfying answer is common with event attribution. In 2012, Allen and his team published a paper on the heat wave that baked huge swaths of Russia in the summer of 2010. Their conclusion: that climate change made only a modest contribution, but a warmer climate had made that sort of heat wave more likely to occur in general.

It doesn’t fit well on a protest placard, but this information may one day help build better actuarial tables, translating complicated data into real-world impacts. If reinsurers expect to spend more money on losses in your region, your insurance company’s insurance gets more expensive, and your policy should, too. But it doesn’t always work that way.

Florida is a case in point. There, where some 2.4 million people live less than four feet above the high-tide line and where many U.S.-bound hurricanes are likely to pass, insurers can only use historical models to calculate risk. Climate scientists estimate that sea levels will rise anywhere between 8 inches and 6.6 feet by 2100 — enough to inundate whole neighborhoods in Miami, even on the lower end. The past offers a comfortable fiction that could limit rate hikes by writing the risk off the books.

As more groups like the Geneva Association call for risk models that account for climate change, politicians are going to get a different message. Denying climate change isn’t just foolish — it’s bad for business.

Maggie Koerth-Baker is science editor at BoingBoing.net and author of “Before the Lights Go Out,” on the future of energy production and consumption.

The Great Stagnation of American Education

By ROBERT J. GORDON

For most of American history, parents could expect that their children would, on average, be much better educated than they were. But that is no longer true. This development has serious consequences for the economy.

The epochal achievements of American economic growth have gone hand in hand with rising educational attainment, as the economists Claudia Goldin and Lawrence F. Katz have shown. From 1891 to 2007, real economic output per person grew at an average rate of 2 percent per year — enough to double every 35 years. The average American was twice as well off in 2007 as in 1972, four times as well off as in 1937, and eight times as well off as in 1902. It’s no coincidence that for eight decades, from 1890 to 1970, educational attainment grew swiftly. But since 1990, that improvement has slowed to a crawl.
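
The doubling arithmetic behind those figures is easy to verify (a quick check using only the 2 percent growth rate quoted above):

```python
import math

g = 0.02  # 2 percent annual growth in real output per person

# Years for output per person to double at growth rate g:
print(f"Doubling time: {math.log(2) / math.log(1 + g):.0f} years")  # ~35

# Cumulative growth over the spans cited above (all ending in 2007):
for start in (1972, 1937, 1902):
    factor = (1 + g) ** (2007 - start)
    print(f"{start} -> 2007: {factor:.1f}x better off")  # ~2x, 4x, 8x
```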

Companies pay better-educated people higher wages because they are more productive. The premium that employers pay to a college graduate compared with that to a high school graduate has soared since 1970, because of higher demand for technical and communication skills at the top of the scale and a collapse in demand for unskilled and semiskilled workers at the bottom.

As the current recovery continues at a snail’s pace, concerns about America’s future growth potential are warranted. Growth in annual average economic output per capita has slowed from the century-long average of 2 percent, to 1.3 percent over the past 25 years, to a mere 0.7 percent over the past decade. As of this summer, per-person output was still lower than it was in late 2007. The gains in income since the 2007-9 Great Recession have flowed overwhelmingly to those at the top, as has been widely noted. Real median family income was lower last year than in 1998.

There are numerous causes of the less-than-satisfying economic growth in America: the retirement of the baby boomers, the withdrawal of working-age men from the labor force, the relentless rise in the inequality of the income distribution and, as I have written about elsewhere, a slowdown in technological innovation.

Education deserves particular focus because its effects are so long-lasting. Every high school dropout becomes a worker who likely won’t earn much more than minimum wage, at best, for the rest of his or her life. And the problems in our educational system pervade all levels.

The surge in high school graduation rates — from less than 10 percent of youth in 1900 to 80 percent by 1970 — was a central driver of 20th-century economic growth. But the percentage of 18-year-olds receiving bona fide high school diplomas fell to 74 percent in 2000, according to the University of Chicago economist James J. Heckman. He found that the holders of G.E.D.’s performed no better economically than high school dropouts and that the rising share of young people who are in prison rather than in school plays a small but important role in the drop in graduation rates.

Then there is the poor quality of our schools. The Program for International Student Assessment tests have consistently rated American high schoolers as middling at best in reading, math and science skills, compared with their peers in other advanced economies.

At the college level, longstanding problems of quality are joined by issues of affordability. For most of the postwar period, the G.I. Bill, public and land-grant universities and junior colleges made a low-cost education more accessible in the United States than anywhere in the world. But after leading the world in college completion, America has dropped to 16th. The percentage of 25- to 29-year-olds who hold a four-year bachelor’s degree has inched up in the past 15 years, to 33.5 percent, but that is still lower than in many other nations.

The cost of a university education has risen faster than the rate of inflation for decades. Between 2008 and 2012 state financing for higher education declined by 28 percent. Presidents of Ivy League and other elite schools point to the lavish subsidies they give low- and middle-income students, but this leaves behind the vast majority of American college students who are not lucky or smart enough to attend them.

While a four-year college degree still pays off, about one-quarter of recent college graduates are currently unemployed or underemployed. Meanwhile, total student debt now exceeds $1 trillion.

Heavily indebted students face two kinds of risks. One is that they fall short of their income potential, through some combination of unemployment and inability to find a job in their chosen fields. Research has shown that on average a college student taking on $100,000 in student debt will still come out ahead by age 34. But that break-even age goes up if future income falls short of the average.
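
A stylized version of that break-even calculation, to show the mechanics (every figure below except the $100,000 debt is a hypothetical assumption, not a number from the research cited):

```python
# Stylized break-even for a graduate who finishes at 22 with $100,000
# of debt plus four years of forgone earnings, but then earns an
# annual wage premium over a high school graduate.
# All figures except the debt are hypothetical.

debt = 100_000
forgone_wages = 4 * 25_000    # assumed earnings skipped during college
annual_premium = 17_000       # assumed college wage premium per year

shortfall = debt + forgone_wages
age = 22
while shortfall > 0:
    shortfall -= annual_premium
    age += 1

print(f"Break-even age: {age}")  # 34 with these assumptions

# If future income falls short of the average, the premium shrinks
# and the break-even age rises, as the article notes.
```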

There is also completion risk. A student who takes out half as much debt but drops out after two years never breaks even because wages of college dropouts are little better than those of high school graduates. These risks are acute for high-achieving students from low-income families: Caroline M. Hoxby, a Stanford economist, found that they often don’t apply to elite colleges and wind up at subpar ones, deeply in debt.

Two-year community colleges enroll 42 percent of American undergraduates. The Center on International Education Benchmarking reports that only 13 percent of students in two-year colleges graduate in two years; that figure rises to a still-dismal 28 percent after four years. These students are often working while taking classes and are often poorly prepared for college and required to take remedial courses.

Federal programs like No Child Left Behind and Race to the Top have gone too far in using test scores to evaluate teachers. Many children are culturally disadvantaged: even if one or both parents have jobs, the parents may have no books at home, may not read to their children, and may park them in front of a TV set or a video game in lieu of active in-home learning. Compared with other nations where students learn several languages and have math homework in elementary school, the American system expects too little. Parental expectations also matter: homework should be emphasized more, and sports less.

Poor academic achievement has long been a problem for African-Americans and Hispanics, but now the achievement divide has extended further. Isabel V. Sawhill, an economist at the Brookings Institution, has argued that “family breakdown is now biracial.” Among lower-income whites, the proportion of children living with both parents has plummeted over the past half-century, as Charles Murray has noted.

Are there solutions? The appeal of American education as a destination for the world’s best and brightest suggests the most obvious policy solution. Shortly before his death, Steve Jobs told President Obama that a green card conferring permanent residency status should be automatically granted to any foreign student with a degree in engineering, a field in which skills are in short supply.

Richard J. Murnane, an educational economist at Harvard, has found evidence that high school and college completion rates have begun to rise again, although part of this may be a result of weak labor markets that induce students to stay in school rather than face unemployment. Other research has shown that high-discipline, “no-excuses” charter schools, like those run by the Knowledge Is Power Program and the Harlem Children’s Zone, have erased racial achievement gaps. This model suggests that a complete departure from the traditional public school model, rather than pouring in more money per se, is needed.

Early childhood education is needed to counteract the negative consequences of growing up in disadvantaged households, especially for children who grow up with only one parent. Only one in four American 4-year-olds participates in preschool education programs, but that’s already too late. In a remarkable program, Reach Out and Read, 12,000 doctors, nurses and other providers have volunteered to include instruction on the importance of in-home reading to low-income mothers during pediatric checkups.

Even in today’s lackluster labor market, employers still complain that they cannot find workers with the needed skills to operate complex modern computer-driven machinery. Lacking in the American system is a well-organized funnel between community colleges and potential blue-collar employers, as in the renowned apprenticeship system in Germany.

How we pay for education shows, in the end, how much we value it. In Canada, each province manages and finances education at the elementary, secondary and college levels, thus avoiding the inequality inherent in America’s system of local property-tax financing for public schools. Tuition at the University of Toronto was a mere $5,695 for Canadian arts and science undergraduates last year, compared with $37,576 at Harvard. It should not be surprising that the Canadian college completion rate is about 15 percentage points above the American rate. As daunting as the problems are, we can overcome them. Our economic growth is at stake.

Robert J. Gordon, a professor of the social sciences at Northwestern University, is at work on a book about the American standard of living since the Civil War.
