Are expert systems and automation good enough to fill the experience vacuum left by retiring workers, or are they so good that vacated positions may never be filled? It’s not the sort of question one asks in the staff room on coffee break, especially when a third of those present will reach retirement age in the next five years. But you should, because history shows that leveraging technology to unburden labor broadens opportunities for human capital, and that should interest the ones who aren’t retiring so soon.
When the industrial revolution overcame our physical limitations, a similarly polarizing debate ensued. Early 19th century textile workers known as Luddites — a name that has since become a byword for anti-technologists — famously smashed mechanical looms for eliminating skilled textile jobs, creating the zero-sum lens through which labor would often see new technology for centuries to come. From the changes affecting farmers, coal miners, and automobile workers of the previous century to the automation now used to perform surgery and fly aircraft, the rate at which technology has displaced experience and skill has only accelerated; and every technology-enabled leap in productivity is accompanied by a similar reaction — what will happen to the knowledge, where will the jobs go, and what’s to become of the art and the judgment and the human elements we thought were indispensable?
What’s a career worth?
At the heart of the debate is nothing less than identity and self-worth. What is it to spend a career performing a role that is eventually automated or, worse yet, simplified or dumbed-down, so a person of meager qualifications and experience can perform it? Imagine a seasoned pilot pinning a medal on the chest of a drone pilot for running a gauntlet of anti-aircraft batteries and successfully completing a mission from a cushy captain’s chair in an air-conditioned bunker using a preprogrammed flight plan and automated evasive maneuvers. Is achievement as meaningful when it’s monitored on a video screen as it was when people were more in control and more at risk?
It depends on what we’re measuring. The debates that polarize us — over character, courage, and intelligence — confuse our humanity with tasks. We tend to identify so closely with the tasks we perform, with the roles we play, that we attribute more value to the actions than to the thought that went into them; and it’s the thinking, after all, that distinguishes us from the technology we design to imitate our thinking. Reducing the physical burden ought to open us to grander intellectual pursuits. It ought to free our time for better thinking, and it’s the opportunities to advance our thinking that we miss when we “walk into the future facing backward,” as a professor of mine used to say. When we focus on the old role and identify too much with what we did (facing backward), and not enough with what we could do (facing forward) by using new technology, we sell ourselves short. We dwell too much on what’s lost and not enough on what’s gained, personally and organizationally.
The brain class
If it’s true that the industrial revolution overcame our physical limitations, then it’s likely the computer age will overcome our mental ones, or at least the ones related to rote memory and repetition. IBM’s Watson computer proved that computing power today far outstrips even the greatest Jeopardy champion’s ability to store and recall facts, and there’s no going back. Machines will continue to get better at storing, managing, and evaluating information faster and more accurately than humans; and they do it tirelessly, with unlimited endurance and no decline in performance.
What machines can’t do better than humans, at least not yet, is think. They can’t imagine, discover, or learn, except in the most linear ways. They can’t free-associate or use analogs from one discipline to solve problems in another, which means we should only be threatened if we think knowledge is finite and that creativity and imagination will eventually flicker out, for that’s the only scenario in which machines catch up with us. And yet the debate persists.
The battle lines in the debate over the impact of technology on employment are neatly drawn. On one side are the eternal optimists who believe all technology and efficiency gains are good. On the other side are the perpetual pessimists, who see technology paving the road to the end of civilization. With hindsight on their side, the optimists assure us that each job destroyed by technology is replaced by two more of greater skill and higher pay. The pessimists concede the point, but only to underscore the accelerating economic polarization of society, with a growing low-wage low-skill workforce on one end, a more concentrated, highly skilled and compensated workforce on the other end, and no one in between — the so-called disappearing middle class. Who is right, and how should we think about our duties as managers in the industrial service sector as we evaluate new technology that we know will replace traditional line jobs?
Regrettably, the answer isn’t clear because so much depends on how the constituents react to the circumstances. Some companies value cost efficiency more than knowledge and human capital. They chew up people in the relentless drive for market share and profits; and just as regrettably, that formula often pays off, with real human consequences. Other companies put people and knowledge first, priding themselves on efficiency gains achieved by homegrown innovation. The research that conclusively finds in favor of one approach over the other has yet to be done, and it may never be done because there’s plenty of evidence to suggest both formulas work. So how should managers choose?
How will you lead?
The answer has something to do with stewardship, a lately out-of-favor principle of leadership. Call it the corporate equivalent of the no-man-left-behind military creed. Some companies believe in it, and others don’t. It’s a cultural choice for which technology bears no blame. Technology is neither value system nor culture, but it does tend to widen individual span of control and synthesize distributed knowledge, which expands the efficiency frontiers of people and assets. It’s up to the enterprise, management, and line workers to harness technology’s power; and we have only ourselves to blame if we walk into the future facing backward and fail to imagine ourselves in new capacities, widening our span of control and increasing our productivity.
Because we associate so intimately with the limitations of past roles, we’re more inclined to see ourselves eliminated, rather than elevated, by technology-driven change. It’s a natural and understandable reaction, but more nostalgic than rational. Will moving with the technology require new skills and training? Sure, but enlightened leadership makes the investment. Cultures that value knowledge and experience tend to capitalize on them rather than eliminate them, especially when knowledge is key to maintaining competitive advantage. Where knowledge and experience count, technology can be a tool, but hardly ever a substitute.
Like the choice of good corporate stewardship, which includes preserving competitiveness, the choice to evolve with the technology is an individual one for which line workers bear no less responsibility. Sadly, not everyone chooses to evolve. Fear, defensiveness, and age can all be factors. Accountability is, too. New technology often reduces the need for manpower. It can flatten hierarchies and increase transparency. Making the transition means embracing closer scrutiny, shorter chains of command, and more responsibility with higher stakes — terrific for an ambitious young Turk, but anathema to workers unwilling to change. Herein lies the forcing function that the perpetual pessimists blame for polarizing society, for eliminating the middle class.
For managers, it’s an intractable problem. Technology that reduces risk, lowers cost, and improves safety and productivity can’t be ignored without compromising the viability of the enterprise, and therefore all jobs, management included. Worse yet is the determination with which some employees cling to old practices, even though resistance to technical revolutions has been futile across the ages, and technology can elevate their importance to the enterprise.
Leadership is central to navigating the crossroads at which art and science collide. The reality is that management’s hands are no less tied than are the hands of line workers in the face of disruptive technology. The difference is that management is paid to lead, to have the answers, to set expectations, and, if the spirit moves them, to do so with a semblance of stewardship. What the enterprise and its employees can’t afford is the delegation if not abdication of leadership. There must be a concerted effort to understand the impact of disruptive forces and to guide the transition in an orderly manner to the optimal long-term benefit of the organization as a whole.
Big industrial enterprises with widely distributed operations are especially vulnerable to the failure of leadership. There was a time when decentralization and pushing decisions to front-line operators was the only way to overcome the impediments of time and distance. Decentralization was a good if imperfect way to exercise timely if inconsistent decision-making. It makes less sense now, especially where uniform best practices and standardization can simulate the collected intelligence of the enterprise and performance can be monitored everywhere all the time.
As technology brings us closer to “total situational awareness” and the prospect of re-centralization looms large, you can feel self-worth coming under siege. The threat, real or perceived, to autonomy and local decision-making authority can feel debilitating, even emasculating. In fact, technology-enabled visibility seems to trigger fear and suspicion, a crisis of confidence that is fundamentally contrary to the organizing principles of a company, if you think about it.
Well-run companies are supposed to be highly coordinated endeavors in which trust is the cornerstone of teamwork that translates into competitive advantage. Does that sound like your work environment? Call it what you will — idealistic, unrealistic — but it’s the common theme among successful sports teams, the members of which are scrutinized by primetime replays and are statistically drawn and quartered by coaches and competitors alike. The common refrain heard from successful leaders such as New England Patriots football coach Bill Belichick and Herb Brooks, who steered the “Miracle on Ice” 1980 U.S. men’s hockey team to an Olympic gold medal, is that the individual is part of a system, and that the system, the team, comes first.
Commercial endeavors, industrial operations, are no different, even though the analogy to sports teams falls flat insofar as athletes personify talent and skill that can’t be automated. Nevertheless, the unifying principle holds: When we focus on individual roles to the detriment of how the enterprise must advance to remain competitive, nobody wins.
Where does that leave managers and workers in the face of disruptive, job-eliminating new technologies? How should we think about preserving the competitiveness of the system when destabilizing new factors emerge? Models to guide us abound, from modern warfare to substance abuse support groups. Starting with the latter, the first step of all recovery programs is acknowledging the problem. Similarly, the genesis and primary role of special warfare SEAL teams, even today, is reconnaissance — understanding and relaying the problem. Interestingly, both groups operate under high stress and rely on a flat, collaborative, open-architecture mode of communication. It’s quite possible that the gravity of the problems each group confronts is the common thread in their adoption of similar models of communication, as well as the advantage they share over workaday routines that may mask the gravity of disruptive influences in the workplace.
Problem or opportunity?
Problem and opportunity assessment may be the least understood and most poorly exercised disciplines of modern leadership. Managers strategize, consultants analyze, leaders decide, and workers protest, usually with simmering skepticism if not outright contempt for each other. And herein lies the problem: Organizations can’t walk into the future facing forward without a clear expression of the problem, which, lest we forget, is the problem of workers and managers alike, and without an equally clear understanding of the opportunity. Running a company like a support group or a SEAL team is probably unrealistic, but demanding the honest assessments for which both organizations are renowned isn’t. In fact, it’s the only way to get organizations to turn around and face forward and to understand the risks and the opportunities. It’s the only prospect of introducing trust so individuals understand their roles in the system and recognize the all-important mission of the enterprise.
Sadly, the precedent of decentralization has introduced a range of unhealthy expectations that are hard to reverse, beginning with the autonomy of the individual, and including the habit of delegating rather than leading. Once again the enemy is culture, not technology. Ironically, technology and automation are bringing us back together in ways we never imagined and presenting challenges we can accept or reject. Roles that used to be static are changing as our lesser, repetitive skills are automated and our higher-level skills are called on to leverage and manage the technology.
As long as we focus on what’s lost, on the elimination of repetitive skills, and ignore what could be gained, the confluence of skill and automation is bound to look like a collision. Some percentage of the workforce will always make that choice, hostage to individual impulses such as fear and self-preservation. Another percentage will always see the inevitability of change and recognize their own opportunity in the system in which they operate. In the middle is a group that could better serve themselves, and be better served by management, with clear definitions of both the problem and the opportunity. These are the swing constituents: the prospective disappearing middle class, according to the pessimists, or the ascending leadership class, according to the optimists. The direction they take, and the culture on which they help management settle, explain a lot about the rise and fall of great companies.
Leading with vision
Fortunately, great teams — leaders and workers alike — tend to find their own ways. Lee Iacocca, for example, was fired by Ford after launching the Mustang and then landed at an ailing Chrysler, where he introduced the minivan and set the company on a decades-long tear of growth and expansion. Steve Jobs was fired from his own company, Apple, before founding NeXT, where he developed the operating system that would reinvent Apple after it acquired NeXT and brought Jobs back to save the company from bankruptcy. Jamie Dimon was fired by his mentor, Sandy Weill, at Citigroup — then the largest U.S. financial services company — only to lead JPMorgan Chase to knock Citigroup and others from that position during the financial crisis. Iacocca’s relentless drive for new product development, Jobs’s ruthless desire for revolutionary simplicity, and Dimon’s hands-on management style are credited with the rise of these organizations over larger, seemingly more viable direct competitors. The instinct to engage and not delegate, the disposition to change rather than accept the status quo, and the ability to keep their hands on the wheel as disruptive forces roiled the markets around them distinguish these leaders and their organizations.
|Burt Hurlock is CEO and a board member of Azima DLI, working closely with the sales, engineering, and technical services teams on strategic growth initiatives and on advancing the company’s scalable enterprise applications of machine health analytics. Hurlock has spent more than 20 years as a founder, builder, advisor, and turnaround executive for a number of venture-backed professional service businesses. He is a graduate of Princeton University and Harvard Business School. Contact him at firstname.lastname@example.org.|
Whether we are managers or workers, disruptive forces — technological or otherwise — deserve our attention. They will change our work at an accelerating pace, as they have changed the work of men and women for centuries. From the Encyclopedia Britannica to Kodak, companies that stumbled for failing to keep pace are legion; the companies that adapted rarely make such lists. Adaptation is a systematic choice, a choice by organizations as a whole to opt in. It means understanding looming change and meeting it head on with willful, dynamic choices that reflect a unifying decision to adapt. Rapidly adapting cultures are more common to high-tech than heavy industry because the velocity of innovation in high-tech is greater. But the Internet is a common catalyst, and the changes that roiled retail and media markets a decade ago are gaining traction in heavy industry today.
Concepts such as Industry 4.0, the Internet of Things, and Big Data all hold the promise of rapid technological advancement that can fundamentally change production management, health and safety, and asset optimization, among other fundamental aspects of industrial management. Heavy industry may be no more prepared for emerging technologies than Borders bookstores were for Amazon, but the change is coming, and the impact will hold material benefits for those willing to adapt and material costs for those who are not. So the next time you’re in the staff room on coffee break, pose the following question: How will emerging technology change your job, and how will it help you and the enterprise advance? Having some imaginative answers may smooth the certain journey of your enterprise through the crossroads where skills, knowledge, and technology will soon converge in the industrial sector.