Competing with robots is making work worse (Opinion)
It is said that as work becomes more and more automated thanks to artificial intelligence, our special human traits – empathy and humor, creativity and kindness – will only become more valuable.
I wonder. What we have seen so far does not leave me optimistic.
Instead of embracing what makes us different from machines, we humans often seem to be trying to imitate them. Too many of us skip lunch, skip breaks, and work more feverishly, as if we’re just brains attached to rather inefficient and meaty hardware: bodies that (irritatingly) get sick, break down, and require regular feeding and rest. Or we try to do too many things at once — texting while driving, emailing during meetings — as if we’re a laptop that can run multiple programs instead of a human that can only focus on one thing at a time.
Downtime is a flaw in a machine but a requirement for a human being. Yet the pressure is to work ever faster, as if speed and quality went hand in hand. The arrival of chatbots like GPT-4, which can churn out credible text in seconds, ups the ante on humans even more.
It’s as if John Henry were not just trying to outrun the steam drill but to become one. The result is that people and their workplaces have become less patient, less civil. Less human.
But in the end, trying to imitate machines is a losing battle. “The race for IQ is being lost,” says Tomas Chamorro-Premuzic, chief innovation officer at ManpowerGroup and a professor of business psychology. “By that I mean all things that are stochastic, algorithmic, objectively solvable. We won’t be able to compete with machines.” What we need to do is rehumanize a dehumanized workplace, he argues in his new book, “I, Human.”
Chamorro-Premuzic cautions that we must resist the pull of trying to outrun the machines. We will never be faster than they are; we will never be more coherent or more rational; we will never be able to work longer hours. Yet as technologies have allowed us to work more efficiently, efficiency has become valued as an end in itself, even when it comes at the expense of other values, such as creativity, thoughtfulness or generosity.
Interacting with computers so often can fuel a kind of amnesia about how to treat other humans. Christine Porath, an associate professor of management at Georgetown University’s McDonough School of Business, has studied workplace incivility for more than two decades and has found that many of us are ruder to one another than we used to be, an increase that predates the Covid-19 pandemic. And I’m not just talking about a failure to say “please” or “thank you.” Porath’s research has documented drivers and waitresses scolded to tears, doctors shouting at nurses, bank tellers shouting at one another. One customer even told a service representative that he hoped his wife and daughters would be raped. What’s wrong with people?
Writing about her research in Harvard Business Review, Porath explains that stress, negative emotions, weak social ties, and a lack of self-awareness can all play a role, but so can technology, which can exacerbate all of these. When was the last time you logged out of Twitter feeling lighter and happier? When your boss interrupts your face-to-face conversation to check his phone, does that build trust? Maybe customers have gotten so used to dealing with self-checkout kiosks that some have forgotten how to interact with real people.
I don’t think technology is the enemy (and even if it were, it isn’t going anywhere). Social media can be bad for our mental health, but FaceTime makes it easy for my son to talk to his out-of-state grandparents. Part of the solution may be to design more flexible technology systems: more “tell me how I can help you” and less “press 1.” The creators of Microsoft’s Bing Chat have released an update that lets users choose the tone they want the bot to take: creative, balanced or precise. That should make it easier for people to work with Bing. But I wonder: Will we begin to expect our human colleagues to be just as flexible?
A robot is not a person, even if its apology sounds genuinely contrite or it sweetens its responses with emojis. We can’t – and shouldn’t – be perpetually polite like Siri, eager to please like Alexa, or self-effacing like ChatGPT.
“People have called it ‘mansplaining as a service,’” notes Chamorro-Premuzic. “But really, it’s more like a woman with imposter syndrome: too humble to mansplain, always apologizing or saying ‘I might be biased.’”
So perhaps the most important lesson is that although robots are enormously cost-effective compared with humans, they also have downsides that are harder to quantify but no less real. Studies of even “empathic” robots show that they don’t have the positive effect on customers that real humans do, especially when the customer is already angry. Maybe it’s worth hiring humans to deal with other humans.
Inevitably, many of us will work alongside machines, and we will have to keep managing our emotions as we do. Not to spare the machines’ feelings – after all, they don’t have any – but to preserve our own dignity.
Sarah Green Carmichael is an editor at Bloomberg Opinion. Previously, she was managing editor of ideas and commentary at Barron’s and executive editor at Harvard Business Review, where she hosted the “HBR IdeaCast.”