Emily Fitzgerald, from Boston University, built a robot that can recognize specific objects as it zooms around, and avoid them to prevent collisions. The robot "speaks," describing in simple sentences what is in front of it using its "eyes," actually an attached camera.

At first glance, the robot is admittedly not very impressive: a two-foot-tall stack of trays rolling on wheels, with a laptop on the top tray. But when it encounters an object, something remarkable happens. It will say, for example, "This is a ball." A simple sentence, yes, but teaching a robot to recognize objects is harder than it sounds.

“It’s almost self-thinking” in its ability to get around roadblocks, says Fitzgerald, who bestowed the "bot with a brain" as her summer 2015 project with Boston University’s Undergraduate Research Opportunities Program (UROP), which provides funding for faculty-mentored research by undergraduate students.

The robot is equipped with a deep neural network, a form of artificial intelligence loosely modeled on the brain's neurons. Before it can recognize an object, the network has to be trained on a large amount of data.

“There’s an algorithm that will take a ton of pictures of one object and will put it in and compile it all,” says Fitzgerald. “Then we basically assign a number to it.” From there, when the robot encounters such an object, it matches what it sees to the stored pictures and reads the associated number.

"And then it will be able to use that as a reference, so it can exclaim, ‘Oh, it’s a ball,’ ‘It’s a cone,’ or whatever object I had decided to teach it,” explains Fitzgerald.

Hard work

Fitzgerald had help from Massimiliano Versace, a BU College of Arts & Sciences research assistant professor and director of BU’s Neuromorphics Lab, who oversaw Fitzgerald’s project.

The team admitted with a laugh that they had difficulties training the robot in object recognition. “There were quite a few times where we did despair a little bit that, you know, this wasn’t going to work,” says Fitzgerald, who first had to master an unfamiliar programming language. Then the team needed to make sure that the array of different software in the project would work together “without crashing the system,” she says.

There were also times when the software was incompatible, resulting in an uncooperative robot.

"Most of the time, it just didn’t start," Neves says, ruefully recalling those tough moments. It also could get lost. Sensors in its wheels tell the robot how far it’s traveled, but Fitzgerald notes, "the wheels weren’t moving at a constant rate, so whenever the robot would shoot off, it would think it had gone farther than it had because the wheels spun faster."

Despite the difficulties, the project made Fitzgerald realize that she wants to pursue a career in bioimaging, focusing on robotic surgical devices for human patients that run on neural networks.

“I’ve actually taken this project and I’ve said, OK, what else can I do with it in the biomedical setting as well?” she says. “It’s really shaped how I’ve thought about my future going forward.”
