A View from Walter Frick
Can Software Eat the Deficit?
How software and IT can lighten the government’s obligations by improving health care and education.
In Marc Andreessen’s words, software is “eating the world.” Everywhere you look, software is transforming industries and increasing efficiency. Well, not everywhere.
As Washington Post columnist Steven Pearlstein noted in a recent piece, productivity gains have materialized in some areas more than others:
While the income of both sets of workers has risen, more people are now employed in the service sector while fewer are making goods. Significantly, a big price gap has opened — the prices of goods are lower than they used to be while service prices are higher.
This phenomenon is called Baumol’s cost disease, named for economist William Baumol, who has recently released an updated book on the subject. The original example is classical music. Though we’ve gotten much better at producing lots of different goods, it still takes roughly as much labor as ever to put on a symphony. And yet a performance costs more today, because the musicians’ salaries have risen in response to higher salaries in other sectors.
The problem, as Pearlstein notes, is that the government is largely in the service business, particularly via healthcare and education. (At the federal level, healthcare is far and away the bigger slice of the pie.) And so Baumol’s cost disease suggests that government will account for a growing portion of the economy, as we pay more and more for those services. As Pearlstein relays Baumol’s argument:
Given the large productivity gains in the goods-producing sector, he says, we can not only afford the higher prices for things such as health care and education, but still have plenty of money left over to pay for more food, more cars, bigger houses, more clothes and more home appliances.
To put it bluntly, there’s just no reason we should accept this. As economist Tyler Cowen recently wrote, the rising costs and stagnant productivity of sectors like medicine and education shouldn’t be thought of as inherent to those enterprises.
Specifically, I’m bullish on the role that IT can play in both healthcare and education in the coming years. Skeptics might note that the government doesn’t have the right incentives to adopt technology the way the private sector does, but that’s an oversimplification.
In healthcare, the government is in the insurance business, while private institutions are in charge of providing care. And one of the big changes in healthcare right now is the shift in incentives away from “fee for service,” by which providers are reimbursed by insurers for performing tests and the like, and toward “pay for performance,” where health outcomes are prioritized. As the incentives improve for these private healthcare institutions, health IT startups are cropping up across the country to provide new and more efficient ways of managing patients.
In education, the story is a bit different, as local and state governments are directly in charge of providing education at the elementary and high school levels. But in higher education and beyond, software is starting to play a bigger role. New entrants like Coursera, Udacity, and Khan Academy are providing offerings that, while perhaps not yet comparable to a university degree, fit the classic “good enough” model of disruptive innovation.
In both healthcare and education, there’s reason to expect software and IT to drive significant productivity gains over the next decade. It’d be foolish for either sector to go all in on any one unproven solution, but countless experiments by startups, incumbents, and, yes, the government will slowly point the way toward greater efficiency.
In an era of rising projected deficits, driven largely by healthcare, we simply can’t afford to let Baumol’s cost disease drain the government’s coffers. Luckily, as software continues to eat the world, there’s no reason it can’t take a bite out of the deficit, too.