Technology has increased productivity and emancipated humankind from day-to-day tasks, enabling automation over drudgery, hasn’t it? As the march of the machine continues into the digital age, Chris Bell explores economist Robert Solow’s famous quip that you can see the computer age everywhere but in the productivity statistics…
It seems self-evident that information technology makes us more productive.
However, academics and technology sceptics know all about the productivity paradox. It was popularised in a 1993 article by Professor Erik Brynjolfsson, who noted an apparent contradiction between advances in computing power and the slow growth of productivity.
Let’s face it; technology’s transformative effect on business isn’t always apparent. Be brutally honest, even with all of your new devices and mobile working capabilities, how much more productive are you today than you were 10-15 years ago? Enough to justify every dollar you’ve spent on technology in the interim?
More likely your productivity is being spread more thinly over what was previously your leisure or commuting time. This is corroborated, in part at least, by responses to a Clarian Human Resources study completed in conjunction with Massey University this year, the Great New Zealand Employment Survey 2013 (a nationwide online survey based on 334 responses). For example, one comment read: “Constantly accessible. People expect faster responses. Too easy to keep checking in on emails at home and on leave”. Respondents cited “excess workload” as a barrier to performance, and nearly two-thirds felt IT led to spending more time on work.
It’s partly a perceptual problem, of course; just because you’re at work doesn’t mean you’re working. As the Economist’s Buttonwood columnist wryly noted, the ability to watch funny cat videos doesn’t count as increased productivity, and the same publication has been saying new technologies don’t automatically lift productivity since at least 2003: “Firms need to work out how to reorganise their business to make best use of any important new technologies before they can reap the full rewards.”
Back in 2000, Professor Robert Gordon at Northwestern University in the US wrote a paper (Interpreting the ‘One Big Wave’ in US Long-Term Productivity Growth) in which he asked whether the computer and internet revolutions are as important as the first industrial (steam) and second industrial (electrical and internal combustion) revolutions. He contended that many of the inventions that initially led to the deployment of computers occurred in the 1970s and 1980s, and that since then the majority of notable developments had been in communications and entertainment.
“But that was before the effects of the internet,” you reasonably respond. “Now we have the cloud, cheap storage, reliable web search and robust, free email!” So why aren’t these innovations unambiguously reflected in our productivity figures? Well, for one thing, says Oxford University economist Paul David, there was by comparison no notable productivity growth until at least 40 years after the introduction of electric power. It took until around 1920 for US machinery to be connected and for organisations to re-engineer themselves to capture the benefits of electricity. David also calculates that a technology only begins to significantly affect productivity when it has reached a 50 percent penetration rate. US computer use, for example, only reached this mark around 2000.
This article was originally published in iStart Technology in Business magazine