Future of Robots
"It’s days like today that I’m pretty sure the robot uprising isn’t happening any time soon.”
That’s what one of Blake Hannaford’s grad students told him recently after encountering some challenges in the lab. A robotics professor at the University of Washington in Seattle, Hannaford knew exactly what he meant.
“I’m never going to rule stuff out,” Hannaford, whose work focuses primarily on robotic surgery, said of potential advances. “But if you look back on science fiction from the ’50s and ’60s and compare it to today, it really missed the mark.”
In fact, you could argue, pop culture in general has ruined robots. Or at least most people’s concept of what robots actually are. According to movies and television, they’re bickering Star Wars chums R2-D2 and C-3PO. They’re Star Trek’s superhuman Data and Futurama’s boozy Bender. And, of course, they’re Arnold Schwarzenegger’s murderous-turned-virtuous cyborg in the Terminator flicks. That dude’s the biggest robo-cliché of all. Or maybe it’s RoboCop. Tough call.
It may not surprise you in the least to learn that robots are actually none of those. Most of them look nothing like humans and all of them — even the more dazzling models — are pretty rudimentary in their abilities. (Sometimes, too, they’re purposely ridiculous — like the “crappy” contraptions of Simone Giertz.)
That’s not to imply a dearth of progress. At companies and universities around the world, engineers and computer scientists are devising ways to make robots more perceptive and dexterous. More human-like in cognitive ability and, in some cases, appearance. In warehouses and factories, at fast food joints and clothing retailers, they’re already working alongside humans. This one, in Germany, can pick like a champ. They’re even starting to perform functions that have typically been the domain of humans, such as making coffee, caring for the elderly and, crucially, ferrying toilet paper. One Redwood City, California-based startup just got $32 million in Series A funding to further develop its robot waiters. And here’s a neat new schlepper-bot named Gita. They’re even proliferating down on the farm. But no matter which sector they serve, robots are far less advanced than many thought they’d be by now.
Decades ago, Hannaford said, “everyone was focused on energy, and extrapolating humans’ use of it. [They thought], ‘A jet can fly to Europe, so in 2020 we’ll be able to go to Mars in a passenger vehicle.’”
What they missed, he went on, is that “energy didn’t scale.” Unlike computing — where Moore’s Law, the observation (now widely considered defunct) that the number of transistors on a microchip doubles roughly every two years, kept driving the cost of ever-more-powerful computing down decade after decade — the cost per unit of energy never fell by anything close to 50 percent every 18 months.
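To get a feel for how extreme that kind of compounding is, here is a back-of-the-envelope illustration (not from Hannaford, just the arithmetic of repeated halving): a cost that drops by half every 18 months shrinks to roughly a hundredth of its starting value within a decade, and to roughly a millionth within 30 years. Energy never came close to that curve.

```python
# Toy illustration: how a Moore's-Law-style halving every 18 months compounds,
# which is the trajectory computing costs roughly followed and energy did not.
def cost_after(years, halving_period_years=1.5, start_cost=1.0):
    """Fraction of the original cost remaining after repeated 50% drops."""
    halvings = years / halving_period_years
    return start_cost * 0.5 ** halvings

for years in (10, 20, 30):
    print(f"after {years} years: {cost_after(years):.7f} of the original cost")
# after 10 years: ~0.0098    (about 1/100th)
# after 20 years: ~0.0000969 (about 1/10,000th)
# after 30 years: ~0.0000010 (about 1/1,000,000th)
```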
But other factors continue to have a significant impact on computing and, consequently, robotics. Computing power per watt of electric power, for instance, is growing dramatically. In everyday terms, that means your smartphone can do more with the same battery life. It also means quicker advances in artificial intelligence — things like computer vision and natural language processing that help robots “see” and learn. The writing of more efficient software code is another way to enhance robotic performance. In a couple of decades, perhaps, robots might do most of our coding.
ROBOTS MIGHT STEAL YOUR JOB
Going forward, Hannaford said, robots will “free up people’s brains” to perform other, more complex tasks. But just as the industrial revolution displaced countless humans who performed manual labor, the robotics revolution won’t happen — and isn’t happening — “without pain and fear and disruption.”
“There’s going to be a lot of people who fall by the wayside,” he said of the countless jobs that will be automated or disappear entirely.
More than 120 million workers worldwide (11.5 million in the U.S.) will need retraining in the next few years alone due to displacement caused by artificial intelligence and robots, according to a recent IBM Institute for Business Value study. Not all of them will get that retraining, of course, but the ones who do will be more apt to land new types of jobs ushered in by the robot revolution.
In a warehouse setting, for example, workers who transition to tasks that require “higher skills,” such as thinking and complex movement, are far less at risk of getting robo-bumped. Those who stay on the simplest tasks will get bumped. Vince Martinelli, head of product and marketing at RightHand Robotics outside Boston, is confident that simple but prevalent jobs like warehouse order picking will largely be done by robots in 10 to 20 years. Right now, though, the technology just isn’t there.
But some experts say the more robots outperform humans, the more humans will be expected to keep up.
“As we start to compare the speed and efficiency of humans to robots, there is a whole new set of health and safety issues that emerge,” Beth Gutelius, associate director of the Center for Urban Economic Development at the University of Illinois–Chicago, told the New York Times.
That’s another argument for retraining. As authors Marcus Casey and Sarah Nzau noted in a recent Brookings Institution blog post titled “Robots Kill Jobs. But They Create Jobs, Too”: “The development of technologies that facilitate new tasks, for which humans are better suited, could potentially lead to a much better future for workers. While the widespread introduction of computers into offices certainly displaced millions of secretaries and typists, the new tasks in associated industries meant new occupations, including computer technicians, software developers and IT consultants.”
BUT HUMANS ARE STILL WAY SMARTER THAN ROBOTS
“When people see a robot do something, even if it’s a very simple task like picking things and setting them down, they immediately imagine it can do much harder things,” Martinelli said. “We get lots of questions when people are looking at a system, and we have to keep reminding them that what is simple for you and me to do is actually quite advanced.”
To more effectively drive that point home, RightHand invented a game called Pick Like a Robot that requires three people to perform a robot’s functions. One person is blindfolded and given a pair of metal tongs — they’re in charge of grabbing the item in question. Another acts as the robot’s vision system by placing their finger on whichever item they want the picker to choose. The third participant is the robot’s intelligence, responsible for guiding the picker to properly grab the item. As in robotics, the trick is to smoothly integrate all of those systems. It is, no shock, extremely challenging.
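That same three-way split shows up in real picking systems: a perception module estimates where the item is, a planner decides how to grab it, and the gripper executes, and a small error at any hand-off sinks the whole pick. Here is a minimal, purely illustrative sketch of that pipeline; the classes, thresholds and toy "scene" format are invented for this example, not RightHand's actual software.

```python
# Illustrative pick pipeline: perception -> planning -> execution.
# Everything here is a made-up stand-in to show where the hand-offs can fail,
# not any vendor's real robot API.
from dataclasses import dataclass

@dataclass
class Detection:          # what the "vision" participant provides
    item_id: str
    x_m: float            # estimated item position on the table, in meters
    y_m: float
    confidence: float

@dataclass
class Grasp:              # what the "intelligence" participant decides
    x_m: float
    y_m: float
    width_m: float        # how far to open the gripper

def perceive(scene: dict) -> Detection:
    """Vision: locate the target item (here, just read it from a toy scene)."""
    item = scene["target"]
    return Detection(item["id"], item["x"], item["y"], confidence=item["conf"])

def plan_grasp(det: Detection) -> Grasp:
    """Intelligence: choose a grasp centered on the detected position."""
    return Grasp(det.x_m, det.y_m, width_m=0.08)

def execute(grasp: Grasp, scene: dict) -> bool:
    """End effector: succeed only if the grasp lands close to the true position."""
    true_x, true_y = scene["target"]["true_x"], scene["target"]["true_y"]
    error = ((grasp.x_m - true_x) ** 2 + (grasp.y_m - true_y) ** 2) ** 0.5
    return error < 0.01   # a centimeter of perception error is enough to miss

def pick(scene: dict) -> bool:
    det = perceive(scene)
    if det.confidence < 0.8:   # vision isn't sure, so don't even try
        return False
    return execute(plan_grasp(det), scene)

# An item whose position is estimated 1.5 cm off is enough to make the pick fail:
scene = {"target": {"id": "box", "x": 0.315, "y": 0.20, "conf": 0.95,
                    "true_x": 0.30, "true_y": 0.20}}
print("pick succeeded:", pick(scene))   # -> pick succeeded: False
```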
Echoing Hannaford’s grad student, Robotic Systems Integration COO Raj Bhasin characterizes the robot itself as “just a dumb piece of hardware.” Robots’ progress, he said, depends on human ingenuity and on advances in AI that imbue them with more human-like cognitive abilities, letting them perceive, reason and learn more accurately. (Facebook, for example, has reportedly developed a reinforcement learning algorithm that lets robots navigate different indoor environments sans mapping.) Once AI-driven robots can match or even outperform people in more than just simple, repetitive, pre-programmed tasks, we’ll really be onto something.
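For a sense of what “reinforcement learning” means in that parenthetical: rather than being handed a map or a fixed program, the agent tries actions, gets rewarded when something works and gradually learns a policy. Facebook’s system is far more sophisticated, but a toy tabular Q-learning agent finding its way across a small grid, written purely for illustration, captures the basic idea of learning to navigate without ever being given a map.

```python
# Toy reinforcement learning: a tabular Q-learning agent learns to reach a goal
# on a 4x4 grid purely from trial-and-error reward. It is never handed a map;
# it simply tries moves and remembers which ones paid off.
import random

SIZE, GOAL = 4, (3, 3)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]      # up, down, left, right

def step(state, action):
    """Environment: move within the grid; reward +1 only on reaching the goal."""
    nr = max(0, min(SIZE - 1, state[0] + action[0]))
    nc = max(0, min(SIZE - 1, state[1] + action[1]))
    new_state = (nr, nc)
    return new_state, (1.0 if new_state == GOAL else 0.0), new_state == GOAL

Q = {((r, c), a): 0.0 for r in range(SIZE) for c in range(SIZE)
     for a in range(len(ACTIONS))}

def greedy(state):
    """Best-known action for this state, breaking ties at random."""
    qs = [Q[(state, a)] for a in range(len(ACTIONS))]
    return random.choice([a for a, q in enumerate(qs) if q == max(qs)])

alpha, gamma, epsilon = 0.5, 0.9, 0.1
for _ in range(500):                               # training episodes
    state = (0, 0)
    for _ in range(200):                           # cap episode length
        a = random.randrange(len(ACTIONS)) if random.random() < epsilon else greedy(state)
        new_state, reward, done = step(state, ACTIONS[a])
        best_next = max(Q[(new_state, x)] for x in range(len(ACTIONS)))
        Q[(state, a)] += alpha * (reward + gamma * best_next - Q[(state, a)])
        state = new_state
        if done:
            break

# After training, following the learned policy walks straight to the goal.
state, path = (0, 0), [(0, 0)]
for _ in range(20):
    state, _, done = step(state, ACTIONS[greedy(state)])
    path.append(state)
    if done:
        break
print("learned route:", path)
```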
“Humans have a hundred thousand years of evolution that makes us really good at tasks we take for granted,” Bhasin said at his office in downtown Chicago, where a couple of tabletop-size industrial robots were on display. “A big part of robotics is what’s called the end effector — what’s mounted to the end of a robot to grab objects. There’s a lot of mechanical engineering that goes into that aspect. How close we are to doing what a human can do depends on the object.”
Consider the difficulties encountered in RightHand’s Pick Like a Robot game and apply them to every mechanical task conceivable. And it’s not merely the task, but the speed at which that task is done. Could something like this “ultrasonic gripper” be a solution? Maybe. But presently, Bhasin said, robots are still very slow and deliberate. Even so, “we’re not going to need a hundred thousand years to make these things as capable as humans are.”
The key to making them more intelligent and more capable, he said, is reliable data that allows robots to learn more on their own and deal with constantly shifting variables, such as oddly shaped or misplaced objects, without human assistance. (As the saying goes, “garbage in, garbage out.”)
Nonetheless, Bhasin said, when it comes to the industrial automation niche his company serves, “I think maybe there’s a misconception that they’ll do much more than they actually will be able to do.”
And though they will undoubtedly increase in number year after year, it might console you to know that U.S.- and Mexico-based companies ordered fewer robots in 2019 than they did the year before.
DRONES — THE NON-BOMBING KIND — ARE ROBOTS TOO
Like their industrial third (fourth?) cousins, commercial drones (not to be confused with bomb-dropping military drones) have been around in various forms for many decades. And though they’re constantly being improved, their performance remains limited. In the U.S., these typically modest-sized UAVs (unmanned aerial vehicles) are hampered by strict Federal Aviation Administration regulations that prevent their widespread use, especially for commercial purposes, but that’s slowly changing. According to PwC, the global drone market is currently worth around $127 billion, a valuation that will only rise as adoption increases in a variety of areas, including home package delivery and medical transport.
A March 2019 New York Times story titled “Skies Aren’t Clogged With Drones Yet, but Don’t Rule Them Out” noted that e-commerce drone deliveries have already been green-lighted in China. A similar scenario in the U.S., however, depends on “whether regulators eventually allow drone companies to have autonomous systems in which multiple aircraft are overseen by one pilot and whether they can fly beyond the vision of that pilot.”
One drone company doing just that is Wing Aviation LLC. It’s owned by Google parent Alphabet and helmed by CEO James Burgess, who told the Times, “scale doesn’t concern us right now. We strongly believe that, eventually, we will be able to develop a delivery service for communities that will enable them to transport items in just a few minutes at low cost.”
Besides the drones themselves, Burgess added, Wing is also working on developing an “unmanned traffic management system” to keep track of all the robotic flying machines that might someday seem as common as birds.
Then again, as drone expert James Rogers argued in a recent essay for the Bulletin of the Atomic Scientists, there are downsides to grand-scale proliferation. Today’s drones already are sparking concerns over safety and privacy. Tomorrow’s will be far better — and therefore far worse. And not merely because there might be geese-like gaggles of them buzzing to and fro.
“Think of today’s nefarious drones as the Model T of dangerous drones,” Rogers wrote. “As drone technologies grow ever more sophisticated, proliferating in an unchecked and under-regulated manner, ‘hostile drone’ incidents will increase in impact and number.”
In predicting that drones will be central to the delivery of “vital goods and services that keep a nation functioning commercially and socially,” Rogers said they’ll be regularly employed for mail delivery, law enforcement, fire response and emergency medical purposes, among other uses.
And each of those sectors, he added somewhat ominously, “will seek to harness the speed and cost-effectiveness of drones, leaving society increasingly vulnerable.”
So, there’s that.
ROBOTS THAT LOOK AND MOVE LIKE HUMANS & ANIMALS HAVE LIMITED APPEAL — FOR NOW
Outside of a factory or warehouse setting, some say it’s advantageous for robots to look more like humans. That’s where humanoids come in. You may have seen these (currently) non-sentient artificial beings tending bar and slinging six-shooters in HBO’s sci-fi drama Westworld. But their utility in real life depends on the scenario.
Over at RightHand Robotics, Martinelli said the current focus is on wider customer adoption of robots that can solve specific problems in commercial settings. Even some very impressive and sensor-packed models that can run, jump and flip — including several from Boston Dynamics — aren’t in that category. Not yet, anyway.
Boston Dynamics CEO Marc Raibert has said his long-term goal is to “build robots that have the functional levels of performance that are equal to or greater than people and animals. I don’t mean that they have to work the way that people and animals work, or that they have to look like them, just at the level of performance in terms of the ability to move around in the world, the ability to use our hands.”
Recently, the company’s robot dog Spot was made available to a handful of early clients to see how it will fare in the real world. The jury’s still out, and will be for some time. But it’s a start.
As Will Jackson, director at United Kingdom-based Engineered Arts, told BBC television, “Humanoid robots are great for entertainment and they’re great for communication. If you want something that interacts with people, the best way to do that is make something person-shaped.”
Like this invention from Agility Robotics. Dubbed “Digit” and reportedly priced in the low-to-mid six figures, it’s intended for vehicle-to-door delivery of packages weighing 40 pounds or less. Could we see armies of these things in the years ahead? Maybe. Digit hasn’t yet been tested in uncontrolled settings. And if viral YouTube videos are any indication, even a controlled environment is no guarantee of success (#robotfails).
“One of the biggest problems we have is there is nothing as good as human muscle,” Jackson explained. “We don’t come anywhere near to what a human can do. The way you will see humanoid robots is in a commercial context. So you might go into a shop and you might see a robot in there that’s trying to sell you something. Don’t worry about all the clever AI. That’s really going to stay on your computer. It’s not going to chase you up the stairs anytime soon.”
ROBOTS ARE GOING SOFT
But researchers in a newish niche called “soft robotics” are working on mimicking human motion. Developing high-performing robotic brains is incredibly difficult. Getting robots to physically react like people do is even harder, as mechanical engineer Christoph Keplinger explained during a fascinating TEDx talk in late 2018.
“The human body makes extensive use of soft and deformable materials such as muscle and skin,” he said. “We need a new generation of robot bodies that is inspired by the elegance, efficiency and by the soft materials of the designs found in nature.”
Calling biological muscle “a true masterpiece of evolution” that can heal after being damaged and is “tightly integrated with sensory neurons for feedback on motion and the environment,” Keplinger described his efforts to build artificial muscles called “soft actuators” that are as versatile and adaptable as the real thing.
To that end, he and his team in Boulder, Colorado, invented something they dubbed HASEL — hydraulically amplified self-healing electrostatic actuators, which are mechanisms that control movement. Besides expanding and contracting like real muscle, Keplinger claimed, the young technology can operate even faster than the real thing. In addition, he went on, HASEL can be adjusted to deliver larger forces for moving heavy objects, dialed down for more precise movement, and programmed to “deliver very fluidic muscle-like movement and bursts of power to shoot up a ball into the air.”
Besides being compatible with large-scale manufacturing applications, he noted, HASEL technology also could be used to “improve the quality of life” for those who need prosthetic limbs, as well as older people who would benefit from enhanced agility and dexterity.
“Maybe we can call it robotics for anti-aging,” Keplinger said, “or even a next stage of human evolution.”
WE’RE NOT READY FOR THE AUTOMATED FUTURE
To briefly recap:
- Today’s robots are pretty dunderheaded.
- Tomorrow’s robots will be less dunderheaded thanks to advancements in artificial intelligence — particularly machine and deep learning.
- Humans will be replaced by robots in some jobs and complemented by them in many others.
- New jobs will be created, providing employment opportunities for retrained workers and others who have the requisite skills.
For Hannaford, investing in education is the best way to both temper and harness the impact robots will have and increasingly are having. He lamented, however, that society does far too little of that — and therefore is woefully underprepared not only for what’s coming, but what’s happening right now. Among industrialized nations, he said, the U.S. is especially vulnerable.
“Many Americans are not equipped to earn their living in a future society where all the routine tasks are automated. That’s going to be a big, big problem. But it is ultimately solvable by raising our educational standards.”
As for the persistent notion of a post-apocalyptic hellscape patrolled by homicidal cyborgs, that’s pure fiction. Probably. What we’re living through now, and what the future holds more of, is what roboticist Ken Goldberg has described as “multiplicity.” It’s much friendlier than what’s known as “the singularity,” a point at which humans are (hypothetically) overtaken by fully autonomous and even sentient robots. In fact, Goldberg told Wired in 2018, multiplicity is “something that’s happening right now, and it’s the idea of humans and machines working together.” When you order up a car via Uber or Lyft, that’s multiplicity. Or when, down the road, you ride in a self-driving vehicle — that’s multiplicity too.
“The way we have to start thinking about robots is not as a threat, but as something that we can work with in a collaborative way,” he added. “A lot of it is changing our own attitudes.”