Are expert systems and automation good enough to fill the experience vacuum left by retiring workers, or are they so good that vacated positions may never be filled? It’s not the sort of question one asks in the staff room on coffee break, especially when a third of those present will reach retirement age in the next five years. But you should ask it, because history shows that leveraging technology to unburden labor broadens opportunities for human capital, and that should matter to those who aren’t retiring anytime soon.
When the industrial revolution overcame our physical limitations, a similar polarizing debate ensued. Early 19th-century textile workers known as Luddites (a name that has since become a byword for anti-technologists) famously smashed the mechanical looms that eliminated skilled textile jobs, creating the zero-sum lens through which labor would often view new technology for centuries to come. From the changes that swept over farmers, coal miners, and automobile workers in the previous century to the automation now used to perform surgery and fly aircraft, the rate at which technology displaces experience and skill has only accelerated; and every technology-enabled leap in productivity is accompanied by the same reaction: what will happen to the knowledge, where will the jobs go, and what’s to become of the art, the judgment, and the human elements we thought were indispensable?
What’s a career worth?
At the heart of the debate is nothing less than identity and self-worth. What is it to spend a career performing a role that is eventually automated or, worse yet, simplified and dumbed down so that a person of meager qualifications and experience can perform it? Imagine a seasoned pilot pinning a medal on the chest of a drone pilot who ran a gauntlet of anti-aircraft batteries and completed the mission from a cushy captain’s chair in an air-conditioned bunker, using a preprogrammed flight plan and automated evasive maneuvers. Is achievement as meaningful when it’s monitored on a video screen as it was when people were more in control and more at risk?
It depends on what we’re measuring. The debates that polarize us — over character, courage, and intelligence — confuse our humanity with tasks. We tend to identify so closely with the tasks we perform, with the roles we play, that we attribute more value to the actions than to the thought that went into them; and it’s the thinking, after all, that distinguishes us from the technology we design to imitate our thinking. Reducing the physical burden ought to open us to grander intellectual pursuits. It ought to free our time for better thinking, and it’s the opportunities to advance our thinking that we miss when we “walk into the future facing backward,” as a professor of mine used to say. When we focus on the old role and identify too much with what we did (facing backward), and not enough with what we could do (facing forward) by using new technology, we sell ourselves short. We dwell too much on what’s lost and not enough on what’s gained, personally and organizationally.
The brain class
If it’s true that the industrial revolution overcame our physical limitations, then it’s likely the computer age will supersede our mental ones, or at least those related to rote memory and repetition. IBM’s Watson computer proved that computing power today far outstrips even the greatest Jeopardy champion’s ability to store and recall facts, and there’s no going back. Machines will continue to get better at storing, managing, and evaluating information faster and more accurately than humans, and they do it tirelessly, with no decline in performance.
What machines can’t do better than humans, at least not yet, is think. They can’t imagine, discover, or learn, except in the most linear ways. They can’t free-associate or apply analogies from one discipline to solve problems in another, which means we should feel threatened only if we believe knowledge is finite and that creativity and imagination will eventually flicker out, for that’s the only scenario in which machines catch up with us. And yet the debate persists.
The battle lines in the debate over the impact of technology on employment are neatly drawn. On one side are the eternal optimists, who believe all technology and efficiency gains are good. On the other are the perpetual pessimists, who see technology paving the road to the end of civilization. With hindsight on their side, the optimists assure us that each job destroyed by technology is replaced by two more of greater skill and higher pay. The pessimists concede the point, but only to underscore the accelerating economic polarization of society, with a growing low-wage, low-skill workforce at one end; a smaller, more highly skilled and compensated workforce at the other; and no one in between: the so-called disappearing middle class. Who is right, and how should we think about our duties as managers in the industrial service sector as we evaluate new technology that we know will replace traditional line jobs?
Regrettably, the answer isn’t clear, because so much depends on how the people and organizations involved respond to the circumstances. Some companies value cost efficiency more than knowledge and human capital. They chew up people in the relentless drive for market share and profits; and just as regrettably, that formula often pays off, with real human consequences. Other companies put people and knowledge first, priding themselves on efficiency gains achieved through homegrown innovation. Research that conclusively favors one approach over the other has yet to be done, and it may never be, because there’s plenty of evidence to suggest both formulas work. So how should managers choose?