An Assistant Who May Need the
By ANNE EISENBERG
POINT-AND-CLICK devices have long controlled computer screens. But soon they may also control some household robots that can trundle around living rooms, doing useful jobs.
One robot in development at an Atlanta laboratory is commanded by humans with an ordinary laser pointer, the same kind used by lecturers presenting slide shows. Here, though, the pointer tells a robot what to fetch. Shine its bright light on a dropped medicine bottle on the floor, and the robot will go to the spot, retrieve the bottle and roll back with it.
The robot doesn’t yet say, “Your medicine bottle, sir,” but that may also happen someday, said Charlie Kemp, an assistant professor and roboticist in the department of biomedical engineering at the Georgia Institute of Technology. He created the robot with support from graduate students and colleagues.
This dexterous robot may be especially helpful in assisting people with severely restricted mobility — for instance, those with amyotrophic lateral sclerosis, or Lou Gehrig’s disease.
Professor Kemp was inspired to create the robotic system partly because of what he had learned about helper monkeys. These animals fetch objects for quadriplegics who hold laser pointers in their mouths and shine them on items they want retrieved.
He named his one-armed robot El-E (pronounced “Ellie”), because, among other reasons, her lifting style reminded him of an elephant using its trunk.
El-E’s novel interface, the laser pointer, is important because it simplifies a tricky, longstanding problem basic to getting a robot to fetch, said Gaurav S. Sukhatme, an associate professor of computer science at the University of Southern California, who has played with El-E at the Atlanta lab.
“The pointer gives the robot just enough context and guidance to solve the really hard problem of figuring out which object among many lying around in a room to pick up,” Professor Sukhatme said. “People in artificial intelligence have been working on this problem for a long time.”
Just pointing to an object with natural gestures usually isn’t enough to direct a robot, and even when robots navigate to the right spot, it’s hard for them to grasp a particular object unless, for instance, they have a three-dimensional computer model of it, Professor Kemp said. Guided by the laser pointer, though, El-E can fetch objects as varied as towels, wallets or coffee mugs with no need for elaborate computer modeling. The laser pointer has its limits: the object being retrieved must be in the pointer’s line of sight. If the object is behind a bookshelf, or in the next room, the beam won’t reach it.
The robot is far from a consumer item just yet; it is a laboratory prototype, about to be tested with patients at the ALS Center at the Emory University School of Medicine, said Dr. Jonathan Glass, the center’s director. The robot may fill an important need there. “I’ve had patients tell me if they drop their cellphone, they may spend several hours trying to lean down and pick it up,” Dr. Glass said. “And it fills a psychological need, too, not to have to ask for help.”
Andrew Y. Ng, an assistant professor in computer science at Stanford who is developing robot technology for people to use at home, said Professor Kemp’s use of the laser pointer was highly effective. “It is simple, elegant and clever,” he said, “one of those solutions that many of us wish we had thought of ourselves.”
El-E isn’t a biped — she rolls along on wheels. When a laser pointer illuminates a spot in the room, she detects the spot with her wide-angle camera, then trains her camera eyes on it to get the position of, say, the cellphone or book, Professor Kemp said. Then she lumbers off, her built-in laser range finder scanning across the surface for the target. Once she reaches it, a camera in her hand looks downward to get the measure of the object before she grabs it.
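The first step in that pipeline, spotting the bright laser dot in a camera image, can be illustrated with a toy sketch. This is not Professor Kemp’s actual software; it is a minimal, hypothetical example assuming a grayscale camera frame stored as a NumPy array, where the laser dot is simply the brightest pixel above some threshold.

```python
import numpy as np

def find_laser_spot(image, threshold=250):
    """Return the (row, col) of the brightest pixel if it exceeds the
    brightness threshold, else None. `image` is a 2-D grayscale array.
    A real system would also filter by color and spot size."""
    idx = np.unravel_index(np.argmax(image), image.shape)
    if image[idx] < threshold:
        return None  # no sufficiently bright spot in this frame
    return idx

# Simulated frame: a dim room (value 40) with a laser dot at row 12, col 30.
frame = np.full((48, 64), 40, dtype=np.uint8)
frame[12, 30] = 255
print(find_laser_spot(frame))  # (12, 30)
```

Once a spot like this is located in the image, the robot still has to convert that pixel position into a direction in the room, which is where the wide-angle camera geometry and the range finder described above come in.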
Another point-and-click household robot offers a two-way voice and video system that lets Mom and Dad visit with their children even when the parents are in a faraway hotel. This robot, ankle high and shaped like a disc, is connected to a home wireless network; its out-of-town owners can turn on a laptop computer and use the Internet to call the robot sitting in the living room. Then they can use the laptop’s mouse and keyboard to send the robot rolling around the room. On the computer screen, they see what the robot is seeing with its cameras, and they can talk with anyone near the robot’s sound system.
The robot, called ConnectR and not yet on the market, is being tested by its manufacturer, iRobot, said Colin Angle, chief executive. It is expected to cost about $500.
ConnectR’s camera system can show out-of-town parents the printed words in a book their children are holding at home, so the parents can read them a bedtime story from afar. It “will allow people to visit virtually regardless of where they are in the world,” Mr. Angle said.
Copyright 2008 The New York Times Company