For decades, science fiction writers have portrayed robots as sentient beings that relate to humans and live amongst them. In reality, however, robots have existed separate and apart from people — devices in corners or behind walls designed to accomplish tasks for their human beneficiaries instead of with them.
“Robots have been around for a long time … but they were mainly used for automating very repetitive and potentially harmful processes,” said Dr. Bilge Mutlu, professor of computer science, psychology and industrial engineering at the University of Wisconsin-Madison, where he is director of the People and Robots Laboratory.
“Robots were really dangerous for humans in [industrial work environments], so they were usually caged off.”
Not anymore.
Now, machines are moving out of the shadows and working side by side with humans in a technological dance that’s bringing more efficiency to factories and many other kinds of workplaces. Using advanced artificial intelligence (AI), modern robots can predict with increasing accuracy what their human teammates will do next, then intervene to assist them.
Indeed, over the past two decades robots have begun to break free from the cages Mutlu referred to, taking on more dynamic, synergistic work with their human counterparts. What makes this possible is that the next generation of machines can read body language and facial expressions to anticipate and react to employee behavior, boosting the productivity and safety of human workers without replacing them.
In psychological terms, emotional intelligence (EI) describes the ways in which we interact with others and manage social engagements. Some researchers and companies are now looking to leverage AI to imbue robots with EI.
“Since collaborative robotics was introduced to the market, robots have become more flexible and safer,” explained Mutlu, who said their intelligence, size and portability could allow robots to be used for a multitude of tasks in the modern workplace.
Those tasks are only now coming into focus, according to Brad Porter, founder of Collaborative Robotics and former vice president of robotics at Amazon. Putting humans and machines together in close quarters, he said, will have a profound impact on a variety of different industries.
“If we want to see robots taking more of the burden of work in society,” Porter noted, “we need those robots to move out of structured spaces and move into logistics, hospitality, healthcare, municipal services and transportation.”
Learning Something New Every Day
AI and machine learning (ML) are already crucial to making sense of the massive quantities of images that robots ingest in real time.
“The process begins with collecting vast amounts of data on human behaviors — videos, images, audio, and sensor data capturing movements, facial expressions, gestures, and spoken language,” said Michael Kupferman, VP of R&D at Intuition Robotics, maker of the AI-driven companion care device ElliQ.
“This data is then used to ‘teach’ the robot to recognize patterns. Through supervised learning, the robot’s algorithms are trained to distinguish specific movements — like a wave or a nod — and to interpret emotions, such as a smile, a frown, or an enthusiastic tone,” he said.
“By feeding this information into neural networks, AI models learn to identify subtle changes in posture or expressions and associate these with possible human emotions or intentions.”
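In code, the supervised-learning step Kupferman describes might look something like this minimal sketch, which trains a small neural network to label flattened pose keypoints as gestures. The synthetic data, feature layout and gesture labels are illustrative assumptions, not Intuition Robotics’ actual pipeline:

```python
# Minimal sketch of the supervised-learning step: train a classifier to
# label body-pose keypoints as gestures such as "wave" or "nod".
# The data here is synthetic; real systems train on recorded sensor data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Pretend each sample is 17 (x, y) pose keypoints flattened to 34 floats.
X = rng.normal(size=(600, 34))
y = rng.choice(["wave", "nod", "idle"], size=600)  # hypothetical labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A small neural network standing in for the models described above.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```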
Natural language processing (NLP) plays a central role in training robots to detect nuances in tone, sentiment, and emphasis. “For example, if a user’s voice sounds distressed, an empathetic robot might slow down, offer reassurance, or ask if they need assistance,” Kupferman said.
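As a rough illustration of that idea (not a description of any specific product), a transcribed utterance could be scored with an off-the-shelf sentiment model, with the robot reacting when distress seems likely. The model choice, threshold and responses below are assumptions:

```python
# Sketch: score a transcribed utterance with an off-the-shelf sentiment
# model and react when distress seems likely. The 0.9 threshold and the
# responses are hypothetical design choices, not a product's behavior.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default model

utterance = "I can't get this machine to work and I'm running out of time."
result = sentiment(utterance)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.99}

if result["label"] == "NEGATIVE" and result["score"] > 0.9:
    print("Detected likely distress: slow down, offer reassurance.")
else:
    print("No distress detected: continue normally.")
```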
AI in support of human-sensitive robots requires “a combination of sensing and machine learning with that sensor data,” Porter said. “Machine learning is being used to determine the pose or motion of a human or their facial expressions or gaze tracking.”
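A bare-bones version of that sensing-plus-learning combination might run an off-the-shelf pose estimator, such as Google’s MediaPipe, over frames from a robot-mounted camera. The camera source and the use made of the landmarks below are assumptions:

```python
# Sketch: estimate a person's pose from one camera frame with MediaPipe.
# Assumes the default webcam stands in for the robot's camera.
import cv2
import mediapipe as mp

pose = mp.solutions.pose.Pose(static_image_mode=False)

cap = cv2.VideoCapture(0)  # assumption: camera index 0
ok, frame = cap.read()
if ok:
    # MediaPipe expects RGB input; OpenCV captures BGR.
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        # For example, track the right wrist to see whether a hand is rising.
        wrist = results.pose_landmarks.landmark[
            mp.solutions.pose.PoseLandmark.RIGHT_WRIST
        ]
        print(f"right wrist at normalized ({wrist.x:.2f}, {wrist.y:.2f})")
cap.release()
pose.close()
```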
Thanks to that machine learning, robots are constantly improving their ability to understand their co-workers and complete both autonomous and collaborative tasks.
“There are systems that continuously learn and adapt, and the new observations that they detect can be used to further their training and optimization,” Dr. Mutlu said.
“You can also take the robot and move its arm, demonstrate the task and let it repeat that task so it will learn from that demonstration. Or you might use a video of a worker doing the task, captured from the robot’s camera. If you provide enough data to the robot, it can recognize different aspects of a person’s actions and know what it needs to do. That is when you use AI and machine learning methods to train the robot.”
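At its simplest, the “move the arm, demonstrate the task” approach Mutlu describes is a record-and-replay loop over joint angles. The sketch below assumes a hypothetical arm interface with read_joint_angles and move_to_joint_angles methods; a real system would use a vendor SDK or a framework such as ROS:

```python
# Sketch of kinesthetic teaching: record joint angles while a person
# guides the arm, then replay them. The arm interface is hypothetical;
# a real robot would be driven through a vendor SDK or ROS.
import time

def record_demonstration(arm, duration_s=5.0, hz=20):
    """Sample joint angles while a person physically guides the arm."""
    trajectory = []
    for _ in range(int(duration_s * hz)):
        trajectory.append(arm.read_joint_angles())  # hypothetical call
        time.sleep(1.0 / hz)
    return trajectory

def replay(arm, trajectory, hz=20):
    """Drive the arm back through the recorded joint angles."""
    for joints in trajectory:
        arm.move_to_joint_angles(joints)  # hypothetical call
        time.sleep(1.0 / hz)
```

Real learning-from-demonstration systems generalize beyond literal replay, for instance by fitting a model across several demonstrations, but the record-and-replay loop captures the core idea.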
Behind these important, incremental improvements is cloud computing.
“Cloud-based services are unavoidable,” Dr. Mutlu said. “They’re really becoming integral to this kind of work.”
Building Collaborative Systems Using Robots
Collaborative Robotics is one of a handful of companies building autonomous technology that can work side by side with humans. Some of its machines can spare workers from handling sharp objects or carrying heavy materials while orchestrating hand-offs on a production line.
“We set out to build a robot that needs to work in and around humans, so you want it to have as rich a model for what a human might be doing as you can give it so it can anticipate what they will do next,” Porter said.
Because some actions are still too dexterous, skilled or varied to be fully automated, teamwork between human and robotic workers is essential.
“Any kind of process that needs a lot of human expertise — these are very hard to automate,” Dr. Mutlu said.
“The key is bringing these robots into processes, getting them to do what they are good at while leaving people to do what they are good at, and creating a fluent workflow between the robot and human workers. We are building more collaborative systems that can capitalize on human expertise while automating the menial, dangerous and repetitive parts of a task.”
Building collaborative systems requires equipping robots with sophisticated cameras and sensors to collect timely and actionable data on the activities of their living, breathing colleagues.
“I think it is really helpful to understand body language with respect to task actions — they are getting ready to do this or that,” Dr. Mutlu said.
“It can observe a collaborator, know what their task is, predict what they’re about to do and plan its actions accordingly. You can see where a person is looking and know where their attention is, so you can time the robot’s behaviors better. This can create much more fluid collaborative tasks that are safer, more cost-effective and efficient.”
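One toy version of that timing idea: hold off on a hand-off until the person’s estimated gaze falls inside the shared workspace. The gaze coordinates, workspace bounds and robot interface below are all hypothetical:

```python
# Toy sketch: delay a hand-off until the person's estimated gaze point
# falls inside the shared workspace. Coordinates are normalized to the
# camera frame; the zone bounds and robot interface are hypothetical.

HANDOFF_ZONE = (0.4, 0.5, 0.6, 0.8)  # (x0, y0, x1, y1), assumed bounds

def gaze_in_zone(gaze_xy, zone=HANDOFF_ZONE):
    """True if a normalized (x, y) gaze point lies inside the zone."""
    x, y = gaze_xy
    x0, y0, x1, y1 = zone
    return x0 <= x <= x1 and y0 <= y <= y1

def maybe_start_handoff(gaze_xy, robot):
    """Begin the hand-off only once the person is attending to the zone."""
    if gaze_in_zone(gaze_xy):
        robot.begin_handoff()  # hypothetical robot API
```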
Real-World Examples
The prospects for collaboration go beyond the warehouse and the manufacturing floor. Some recent real-world examples show the emerging art of the possible in AI-driven robotic empathy and AI-supported human-robot collaboration.
Science Robotics reports on a robot that has been trained to detect a human smile — and even to smile first. That could make for a more effective, more sympathetic teammate.
SoftBank Robotics unveiled “a robot that doesn’t just perform tasks, but one that can interact with you on an emotional level.” Its Pepper robot is designed to understand and respond to human emotions.
Researchers have built a robot that can take selfies, toss a ball, eat popcorn, and play air guitar. Instead of having to program its actions, humans can simply give it verbal instructions, thanks to natural language processing.
Security-industry association ASIS reports on emerging security robots, which need to support human interaction and communication. “Humanoid security robots should be able to interact with people naturally and intuitively,” the association says, adding that “the AI model integrated into a security robot is crucial to its ability to perform the tasks we require with the level of autonomy and accuracy we expect.”
An example of AI-informed robotics at the edge: Elon Musk claims Tesla’s humanoid robot Optimus will eventually dance, serve drinks, and do household chores.
‘A Little Like Star Wars’
The robotic evolution continues to transform industries not only by supercharging efficiency but also by protecting workers’ jobs and wellbeing. With their newfound ability to understand and react to cues from their human colleagues, machines are paving the way for a more collaborative future.
“Down the road, their capabilities will extend to completely open-ended environments where robots can help people without knowing who the person is or what their task is,” Dr. Mutlu said.
When that happens, robotics enthusiasts insist that everybody’s lives will be easier.
“We are going to increasingly see robots be a part of our everyday lives, working in and around us, picking up trash along the roadside or collecting shopping carts in a grocery store parking lot,” Porter said.
“Increasingly, these capabilities are becoming more and more available. I sometimes think of the future as being a little like Star Wars, where you see humans and robots walking alongside each other and it seems very natural.”
This is an updated version of the article originally published on February 16, 2023.
Chase Guttman is a technology writer. He’s also an award-winning travel photographer, Emmy-winning drone cinematographer, author, lecturer and instructor. His book, The Handbook of Drone Photography, was one of the first written on the topic and received critical acclaim. Find him at chaseguttman.com or @chaseguttman.
Adam Stone contributed to this updated version.