The discussion of the effects of new technologies on traditional jobs and future employment is nothing new. Anyone fortunate enough to have witnessed technology's effects on the manufacturing sector several decades ago can attest to the eventual success and desirability of the outcome for employment and the economy. Initially, the introduction of robotics, numerical control programming, and software that could mimic basic intelligence sent panic through the massive industrial machine shops operated by hundreds or even thousands of manual laborers. The prediction was that, with new skills training and the promise of huge gains in productivity, the transition, while mildly disruptive to centuries-long manufacturing traditions, would result in many more new and better jobs. The targeted blue-collar roles would see opportunities for higher-paying, higher-skilled work.
As we enter another era of technological growth, one marked by unprecedented, rapid advances in everything digital, the discussion of technology's disruption of present and future employment rages on. The new target is white-collar jobs, and the recurring prediction from many schools of thought is that as information technology automates more and more middle-class jobs, fewer workers will be able to find work.
A report released late last year by the Oxford Martin School's Programme on the Impacts of Future Technology attempts to quantify the extent of the threat. It concludes that 45 percent of American jobs are at high risk of being taken over by computers within the next two decades. The authors believe this takeover will happen in two stages. First, computers will begin to replace people in vulnerable fields such as production labor, transportation and logistics, and administrative support.
The rate of replacement may then lose steam because of logjams in harder-to-automate areas such as engineering.
The authors refer to this pause as the "technological plateau" and foresee a second wave of computerization, dependent on the development of good artificial intelligence, following not far behind. At risk in this stage could be management, science and engineering, and even arts positions. This rate of computerization could also be influenced by factors such as access to cheap labor and the regulation of new technology. "Our findings thus imply that as technology races ahead, low-skill workers will reallocate to tasks that are non-susceptible to computerization — i.e., tasks that required creative and social intelligence," the authors write. "For workers to win the race, however, they will have to acquire creative and social skills."
Massachusetts Institute of Technology professors Erik Brynjolfsson and Andrew McAfee take a more optimistic view. The authors of "The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies" agree that intelligent machines will increasingly displace many workers, but they believe that work will eventually shift to the new jobs technology creates. Workers will be displaced, not replaced. It is an argument first advanced decades ago, and one that proved more true than false, but an important difference arrives with the second machine age.
Computers have begun to encroach on white-collar jobs and increasingly automate even professional positions. Today's advanced technology is challenging once-settled assumptions about the limits of what software could accomplish without human intervention and interaction. Artificial intelligence is advancing into realms once thought beyond its reach. Computers can diagnose, and predict survival rates for, conditions like breast cancer from X-rays as accurately as radiologists.
At the center of the debate is a single question: Will the experience of the past be replicated in the coming era, or will the speed and extent of technological advancement produce a profoundly different result? Past performance is no guarantee of future outcomes. Will today's extraordinary technology be different, eliminating jobs on a significant scale?
It appears that while the new digital era has ushered in profound changes in how society works, plays, and interacts, only time will reveal how all this new technology will affect the way we make our living. But no matter which side of the debate you argue, one fact is clear, as Brynjolfsson and McAfee tell us: "There's never been a worse time to be a worker with only 'ordinary' skills and abilities to offer, because computers, robots, and other digital technologies are acquiring these skills and abilities at an extraordinary rate."