Star(gazing) Wars

You’ve all heard of IQ, but what about VIQ? Visual Intelligence Quotient, that is. That’s what researchers in Pace’s robotics labs are working on in an effort to create robots that can perceive and react to their surroundings.

“We don’t have R2D2 wandering around because robots don’t really know where they are. Your computer doesn’t know it’s on your desk,” says D. Paul Benjamin, PhD, director of Pace’s robotics lab and professor of computer science at Seidenberg. “And that’s what we’re trying to do—build in the software that can give robots that capability.”

If this sounds like something straight out of a video game, you’re right. For Benjamin, the idea of a robot that understands the world around it came in part from watching the iconic plumbers Mario and Luigi navigate the virtual but physically consistent world of Super Mario Brothers, a video game Benjamin began playing with his oldest daughter about 15 years ago.

“We’re working on a computer vision system for a mobile robot so that, as the robot moves around, it will use what is essentially computer game software, like Super Mario, and the main thing about that is the software understands the physics of the world, so balls can bounce and people can’t walk through walls, and so on,” explains Benjamin.
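The game-physics rules Benjamin mentions can be illustrated with a toy sketch (this is an assumed example, not the lab’s actual software): a ball loses a little energy each time it bounces off the floor, and an agent that tries to move forward is stopped at a wall rather than passing through it.

```python
def step_ball(y, vy, dt=0.05, g=-9.8, restitution=0.8):
    """Advance a ball's height one time step; bounce (with energy loss) at the floor."""
    vy += g * dt          # gravity accelerates the ball downward
    y += vy * dt          # move the ball
    if y < 0.0:           # the ball hit the floor: bounce, don't pass through
        y = 0.0
        vy = -vy * restitution
    return y, vy

def move_agent(x, dx, wall_x=5.0):
    """Move an agent along x, but never through the wall at wall_x."""
    return min(x + dx, wall_x)
```

A game engine enforces thousands of small rules like these every frame, which is what makes its virtual world behave enough like the real one to be useful to a robot.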

The robotics lab, which has been a part of the University for several years, is frequented by academically exceptional undergraduate and graduate student researchers who are dedicated to the development and exploration of intelligent agents. Benjamin, who has been awarded a prestigious $300,000 research grant by the Army Research Office, is currently in the second year of his work on the visual intelligence initiative. “This is really cutting-edge research,” he says. “There are very few people around the world working on projects like this.”

Students Lin Yixia and Vinnie Monaco with a robot.

“What we’re doing is creating a system, with a pair of stereo cameras, where the robot makes a virtual copy of the world around it in real-time, so that as it moves around, it sees people moving, cars driving, and it makes a copy of it in its virtual world,” says Benjamin. “The virtual world runs like the real world and can be run faster than real-time. It can be used to predict what people are doing and where they are moving.”
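The prediction idea Benjamin describes can be sketched in a few lines (a minimal, assumed illustration, not the lab’s system): the robot keeps a virtual copy of a moving object and steps that copy’s physics forward. Because each simulated step takes far less wall-clock time than the slice of real time it represents, the virtual world runs faster than real-time, so the robot can see where a person will be before they get there.

```python
def predict_position(pos, vel, seconds, dt=0.01):
    """Step a constant-velocity model forward `seconds` of simulated time.

    Each iteration advances the virtual world by `dt` seconds but takes only
    microseconds of real time, so the simulation outruns the real world.
    """
    x, y = pos
    vx, vy = vel
    for _ in range(int(seconds / dt)):
        x += vx * dt
        y += vy * dt
    return (x, y)

# Predict where a pedestrian walking 1.4 m/s east will be in 3 seconds.
future = predict_position(pos=(0.0, 0.0), vel=(1.4, 0.0), seconds=3.0)
```

A real system would use a richer physics model and update it continuously from the stereo cameras, but the principle is the same: simulate ahead, then act on the prediction.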

The ultimate goal for Benjamin and his team is an intelligent agent that is aware of its environment and the things in it and can interact appropriately with humans and objects, although at the moment the team is focused on getting the robot to interact with people in the lab.

“We hope that by the early part of next semester we’ll have it moving around,” Benjamin says. “We’re going to see if it can cross the street on its own—something that’s hard enough to do in Manhattan for people!”

For more information about Pace’s robotics lab and the work being done by D. Paul Benjamin, PhD, contact the Seidenberg School of Computer Science and Information Systems.