Will a robot take your job within the next decade? The answer is absolutely, unequivocally … it depends.
Mechanization and automation have, of course, been eliminating or drastically reducing employment in various professions for more than two centuries now.
During the industrial revolution, artisans were replaced by factory workers (many of whom have more recently been replaced by industrial robots). Telephone operators (who numbered more than 100,000 at their peak early in the last century) were replaced by switching systems—first mechanical, then digital.
In the 20th century, perhaps nowhere was the impact of technology on employment more dramatic than in farming. In the early 1800s, agriculture employed nearly 80 percent of American workers; by 2008, that figure had fallen to less than 2 percent. Today, farmers use GPS-guided combines that enable them to work through the night and travel up and down their fields “with a degree of accuracy of up to 2 cm to ensure the most efficient harvesting or sowing of the crop.”
More recently, technology has replaced gas station attendants and photo processing technicians, and significantly reduced the ranks of bank tellers, secretaries, and travel agents.
Yet, each new technological advance has been followed by a net increase in jobs. As openings for farm hands and telephone operators were disappearing, entire new industries were springing up, offering vast new employment opportunities: air travel and space exploration, computers and electronics, medical and environmental sciences, and a host of service industries.
The American workforce grew nearly six-fold in the 20th century, despite (thankfully) the abolition of child labor.
If technological progress has historically and consistently produced more jobs than it eliminated, why the angst now, with headlines such as this one on the BBC website, “Will machines eventually take every job?” The answer is likely a combination of several factors:
- The elimination and creation of jobs by technology isn’t perfectly synchronized. There can be periods when a new technology eliminates a large number of jobs quickly (as mechanical switching did to telephone operators), with a lag before other advancements produce a greater quantity of new opportunities. Developed countries may be in such a phase now.
- The slow recovery from the global economic crisis, which started in 2008, has left hundreds of millions of workers feeling vulnerable to job-threatening forces beyond their control, from corruption to increased competition to, yes, technological advancements.
- The speed of change continues to accelerate. According to the BBC article cited above, “Compared to the past, however, what is different about today is the pace at which market transformations are taking place. Aside, perhaps, from the Industrial Revolution, never before have we seen such rapid rates of societal and workforce change. While it’s too early to say for sure, data indicate that the employment market isn’t necessarily evolving fast enough to keep up with this change: the ratio of employment to the overall population has been falling in developed countries.”
- The types of jobs threatened by technology are also changing. Mechanization and digitization initially replaced, for the most part, jobs that were dirty, dangerous, boring, or repetitive. Workers moved from farming, mining, and manufacturing into white-collar and service positions.
It’s not only gas station attendants and assembly-line workers whose jobs are being threatened by the next generation of automation, robotics, and artificial intelligence (AI) technologies—it’s now skilled labor, health care professionals, and white-collar workers who may be replaced. According to recent predictions from Gartner, for example, “Writers will be replaced. By 2018, 20 percent of all business content, one in five of the documents you read, will be authored by a machine.” (Yikes!)
So, is your job at risk of being taken over by a smart machine? If so, what can you do (other than panic)?
It’s vital to understand the characteristics of jobs most at risk of being replaced, and to prepare by improving your skills in areas that are difficult to digitize.
The types of jobs most at risk are those that are repetitive, sequential, and/or subject to a set of “if-then” rules—even if those “rules” are quite sophisticated. IBM’s Deep Blue supercomputer beat world champion Garry Kasparov at chess—a game that is “the iconic representation of human intelligence”—in 1997, and computing power has increased exponentially since then. IBM is now training Watson to find personalized treatments for cancer patients.
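To make the “if-then” point concrete, here is a minimal, hypothetical sketch (the task, category names, and thresholds are all invented for illustration, not drawn from this article): a repetitive, rules-based chore such as first-pass expense review reduces to a handful of explicit rules, exactly the kind of work that is easy to hand to software.

```python
# Hypothetical example: a repetitive "if-then" task expressed as code.
# The categories and the 500 threshold are invented for illustration.

def review_expense(amount: float, category: str, has_receipt: bool) -> str:
    """Apply a fixed rule set to an expense claim and return a decision."""
    if not has_receipt:
        return "reject: receipt required"
    if category not in {"travel", "meals", "supplies"}:
        return "escalate: unrecognized category"
    if amount > 500:
        return "escalate: above auto-approval limit"
    return "approve"

print(review_expense(120.00, "meals", has_receipt=True))   # approve
print(review_expense(950.00, "travel", has_receipt=True))  # escalate
```

Any job whose core decisions can be written down this way, however long the rule list grows, is a candidate for automation.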
However, there are skills where humans have an edge over machines (the cookbook Watson wrote was terrible). ZDNet recently reported that IT workers—who have a better understanding of the job-taking capabilities of smart technologies than most, and who are already seeing positions in data center management and repetitive coding tasks lost to machines—are now focusing their ongoing training efforts in several of these non-technical, uniquely human skill areas.
Interpersonal skills. The ZDNet article points to “interpersonal communication as a key difference between humans and robots,” adding that “Humans (who want to remain employable) will have to have empathy and caring—two traits robots can’t replicate.” Robots also have no sense of humor.
Creativity. Granted, at some level, computers can produce “art,” or music, or food (see the reference to Watson’s recipe book above), or writing (per Gartner). Though definable rules and mathematics play a role in all these types of creations, the images, sounds, tastes, and ideas conveyed can’t be reduced to code.
Computer-generated music tends to be no match even for your most precocious child relative, much less a Michael Jackson, Paul McCartney, or Taylor Swift. No robot chef would come up with pineapple salsa or mole sauce (a sauce for meat, made with chocolate—really?) because those combinations of ingredients shouldn’t logically taste good (though they do).
And while there’s no “formula” for creativity, it is a learnable skill (for humans).
Serendipity. This most often refers to “happy accidents,” where one outcome was sought, but a completely different—and often better—one was achieved. Perhaps the most famous business example of serendipity was the invention of Post-It notes.
A 3M scientist seeking to create a super-strong adhesive came up with a super-weak one instead. It was viewed as a failure until two other 3M engineers figured out how to combine that adhesive with specially coated paper to create what turned out to be a phenomenally successful product. A robot would simply have dismissed the original adhesive as a failure; it took humans to see the opportunity in it.
This can also refer to making connections between two disparate things in a creative way, like imagining the shape of a cheetah when designing the body of a sports car, or turning a line from an old gangster movie into a marketing campaign. Humans can do this.
Judgment. Digital technology is fundamentally binary: yes or no, on or off, if x then y. Robots can take an action or not take it, depending on specific inputs or conditions. There’s no place in the robotic brain, however, for the concept of should.
Here’s a wonderful example. Police in Texas recently pulled over a driver for a broken tail light and expired registration. Peering into the car, they saw three young girls in the back seat, none in (legally mandated) child car seats.
But, upon learning about the driver’s situation (the man had previously lived in his car to save money to bring his family to the area by bus, plus the family had recently been reunited and was living in a hotel), the police officers—instead of issuing citations to the dad—used their own money to purchase three car seats for the girls.
It would have been logical to simply give the driver a ticket, to enforce the law. Computers and robots are great at logic. Fortunately, in this case, the officers used the superior, and uniquely human, trait of judgment to resolve the situation.
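A hypothetical sketch makes the gap plain (the rule set below is invented for this example, not taken from the story): encoded purely as rules, the traffic stop can only end in citations, because nothing in the logic can weigh the family’s circumstances or ask what the officers should do.

```python
# Hypothetical rules-only version of the traffic stop: every condition maps to
# a citation, and there is no branch for context, compassion, or "should".

def rule_based_outcome(broken_tail_light: bool,
                       expired_registration: bool,
                       children_without_car_seats: int) -> list:
    citations = []
    if broken_tail_light:
        citations.append("citation: defective equipment")
    if expired_registration:
        citations.append("citation: expired registration")
    if children_without_car_seats > 0:
        citations.append("citation: child restraint violation")
    return citations

print(rule_based_outcome(True, True, 3))
# ['citation: defective equipment', 'citation: expired registration',
#  'citation: child restraint violation']
```

Buying the car seats was never an available output, because judgment was never part of the program.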
Collaboration. The process of collaboration is fundamentally unpredictable. Experts meeting with one another may each come in with a few ideas and questions in mind, but new ideas and questions will inevitably come up as discussion of the issue at hand progresses. It’s not linear, purely logical, necessarily sequential, or rules-based—so it’s not practical for machines.
Enterprise collaboration software can be used to facilitate communication and coordination among a team, particularly when all of the participants can’t meet face to face (or even when they can, but want to keep a record of the proceedings and have a central repository for shared documents). Such software is designed to assist a team of humans in change management or problem resolution, not to replace them.
Planning. Computers work great when all possible moves and their outcomes are predictable (as in chess). Humans, however, remain superior at planning amid ambiguity, changing conditions, shifting and competing priorities, and incomplete information (which is why even Watson works with a team of human doctors when planning cancer treatments).
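To see why fully predictable situations suit computers so well, here is a minimal sketch of exhaustive game-tree search over a toy take-away game (an illustration, not anything referenced above): when every move and outcome can be enumerated, the machine simply searches the whole tree and plays perfectly. Real-world planning offers no such enumerable tree.

```python
# Toy example: perfect play via exhaustive search. Players alternate taking
# 1 or 2 sticks; whoever takes the last stick wins (rules assumed for illustration).

from functools import lru_cache

@lru_cache(maxsize=None)
def can_win(sticks_left: int) -> bool:
    """Return True if the player to move can force a win."""
    if sticks_left <= 0:
        return False  # no sticks left: the previous player took the last one and won
    # Try every legal move; if any leaves the opponent in a losing position, we win.
    return any(not can_win(sticks_left - take)
               for take in (1, 2) if take <= sticks_left)

for n in range(1, 10):
    print(n, "sticks:", "win" if can_win(n) else "lose")
```

Chess is vastly larger, but the principle is the same: a bounded, fully specified space of moves is exactly where machines excel, and exactly what messy real-world plans lack.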
Conflict resolution. While conflicts typically have some type of logical basis (e.g., which side is legally or factually right), they also involve emotion and even irrationality. Humans can navigate these elements in ways no machine can be programmed to understand.
Negotiation. As with conflict resolution, emotions play a role, the parties don’t always proceed rationally (or at least don’t appear to), and may have unstated objectives (and objections) or ulterior motives. Negotiations of even moderate complexity are difficult to reduce to rules and resistant to machine learning.
Interpretation. Computers hear words; people hear meaning. Humans are superior to machines at dealing with poorly articulated questions, partial responses, and confusing requests.
Innovation. In the words of George Bernard Shaw (and later paraphrased by Robert F. Kennedy), “Some men see things as they are and ask why. Others dream things that never were and ask why not.”
Robots can “see” things as they are … and not much more. They certainly do not now have, and may never have, the vision of the inventor, the innovator.
For example, robots assemble smartphones and technology plays a large role in the design of many of the components, but no robot could have envisioned or imagined the iPhone before its release in 2007. Or earlier inventions like the microwave oven or compact disc. Or, most likely, the innovations of the next several decades. That’s up to us.
Adaptation. This is the most vital skill advantage humans have over robots (at least for now). Humans remain superior to robots at changing our minds, plans, thoughts, and actions in response to changing circumstances and new information. This is the skill that will (hopefully) enable us to remain employed and employable, as robots and automation take over more occupations.
So, bottom line, will a robot take your job? That’s hard to say. According to Pew Research Center findings on this topic:
“Robotics and artificial intelligence will permeate wide segments of daily life by 2025, with huge implications for a range of industries such as health care, transport and logistics, customer service, and home maintenance. But even as (the majority of experts) are largely consistent in their predictions for the evolution of technology itself, they are deeply divided on how advances in AI and robotics will impact the economic and employment picture over the next decade.”
What’s certain is that jobs which are repetitive or rules-based will increasingly be automated or mechanized. Technical skills will be vital across job categories in the coming years, but won’t be sufficient alone. To remain viably employed into the future, you’ll need strong skills in areas like interpersonal communication, creative thinking, and collaboration.
In short, you’ll need to be able to capitalize on the one advantage you’ll always retain over robots and smart machines: your humanity.