Projects

Next Generation Vehicle Project

Ford Fusion research vehicles

This project is a collaborative effort between the University of Michigan, Ford Motor Company, and State Farm to develop vehicles capable of fully automated driving. The UM side of the collaboration comprises the APRIL Robotics Lab, led by Edwin Olson, and the PeRL Lab, led by Ryan Eustice, and focuses on the planning and perception parts of the project. Within this project, I primarily work on the development of robot behaviors and high-level decision making.


DDF-SAM: Decentralized Simultaneous Localization and Mapping

Simulated 20-robot environment with landmarks and obstructions.
This project focuses on enabling fleets of small, cheap, and disposable robots to operate in dangerous environments, such as search-and-rescue scenarios in the aftermath of large-scale disasters or military battlefield reconnaissance and surveillance.

In particular, this project seeks to develop decentralized inference techniques that allow robots to cooperatively build maps of their environment by extending modern Simultaneous Localization and Mapping (SLAM) algorithms to highly distributed robot scenarios.  Because these robots operate in dangerous environments, the system must be robust to both robot and communication failure, while remaining efficient enough to work within the limited communication bandwidth and onboard computing power of small robots.
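The core idea can be sketched in a few lines: each robot marginalizes its own trajectory out of its joint estimate, leaving a compact Gaussian summary over shared landmarks, and neighbors fuse the summaries they exchange in information (inverse-covariance) form. This is an illustrative sketch of that summarize-and-fuse pattern, not the actual DDF-SAM implementation; it also assumes the two summaries are independent, whereas a real decentralized mapper must take care not to double-count shared information.

```python
import numpy as np

def summarize_map(mu, Sigma, landmark_idx):
    """Marginalize a robot's joint Gaussian (poses + landmarks) down to
    the shared landmark variables only.  For a Gaussian, marginalization
    simply selects the corresponding blocks of the mean and covariance."""
    idx = np.asarray(landmark_idx)
    return mu[idx], Sigma[np.ix_(idx, idx)]

def fuse_summaries(mu_a, Sigma_a, mu_b, Sigma_b):
    """Fuse two landmark summaries in information form.
    Assumes the summaries carry independent information; a real system
    must avoid double-counting priors shared between neighbors."""
    I_a, I_b = np.linalg.inv(Sigma_a), np.linalg.inv(Sigma_b)
    Sigma = np.linalg.inv(I_a + I_b)          # fused covariance
    mu = Sigma @ (I_a @ mu_a + I_b @ mu_b)    # fused mean
    return mu, Sigma
```

Fusing two unit-variance estimates of the same landmark halves the uncertainty and averages the means, which is the behavior that lets many small robots jointly sharpen a map none of them could build alone.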

We introduced DDF-SAM, which extends the Smoothing and Mapping (SAM) approach for SLAM to work in a Decentralized Data Fusion (DDF) setting.  The original version of the system was presented at IROS in 2010, followed by extensions at ICRA in 2012 and, later, DDF-SAM 2.0, which combines DDF-SAM with incremental SAM techniques and a more unified map model.  See Publications for more details.

The DDF-SAM project (BORG Project Page) is funded through the Army Research Labs as a part of the MAST Collaborative Technical Alliance.  This project is the primary work constituting my PhD dissertation.

Learning Visual Odometry Feature Detection

This project focuses on enabling a compact stereo vision rig to provide a real-time motion estimate for mobile robots, particularly under challenging conditions.  While standard visual odometry techniques using hand-engineered feature detectors work under most camera conditions, this project examines the use of learned feature detectors to handle challenging cases, such as motion blur and low lighting.
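Whichever detector supplies the feature matches, hand-engineered or learned, the downstream motion estimate is the same: given matched 3D points triangulated by the stereo rig in two successive frames, solve for the rigid rotation and translation between them. A minimal sketch of that least-squares alignment step (the Kabsch/SVD solution), which in practice would be wrapped in an outlier-rejection loop such as RANSAC:

```python
import numpy as np

def estimate_motion(P, Q):
    """Least-squares rigid motion (R, t) with Q ~ R @ P + t, given
    matched 3xN point sets from two stereo frames (Kabsch algorithm)."""
    cp = P.mean(axis=1, keepdims=True)   # centroids of each point cloud
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T            # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Chaining these frame-to-frame estimates yields the robot's trajectory; the quality of the matches, and hence of the detector, directly bounds the quality of the odometry, which is why challenging imaging conditions motivate learned detectors.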

RoboCup Small Size League

RoboCup is a worldwide robotics competition with the goal of building a team of soccer-playing robots capable of beating the World Cup champions by 2050.  My time on the project was spent in the Small Size League (SSL), in which teams of small, fast robots play autonomous soccer with a golf ball.  The project included both hardware and software development of a robot platform, carried out through the student-led Georgia Tech RoboJackets team.

Interactive World Modeling

This project focuses on combining the large-scale graphical inference techniques developed to solve the Simultaneous Localization and Mapping (SLAM) problem with robot planning, enabling a robot to autonomously explore a cluttered environment. Because a robot not only moves through its environment but can also directly interact with objects, we can plan actions that uncover information and resolve perceptual ambiguities, yielding a robust, semantic model of the robot's environment.
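The planning side of this idea can be sketched as greedy information gathering: among candidate actions (move to a new viewpoint, push an object aside), pick the one whose predicted observation most reduces the robot's uncertainty about the scene. The sketch below is purely illustrative, with hypothetical names and a discrete belief over object hypotheses; it is not the project's actual planner.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a discrete belief over hypotheses."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def expected_info_gain(belief, likelihoods):
    """Expected entropy reduction from one action.
    likelihoods[o, h] = P(observation o | hypothesis h) for that action."""
    h0 = entropy(belief)
    gain = 0.0
    for like in likelihoods:          # one row per possible observation
        p_obs = like @ belief         # P(o) under the current belief
        if p_obs > 0:
            posterior = like * belief / p_obs   # Bayes update
            gain += p_obs * (h0 - entropy(posterior))
    return gain

def best_action(belief, action_models):
    """Greedy one-step choice: the action with maximal expected gain."""
    gains = [expected_info_gain(belief, L) for L in action_models]
    return int(np.argmax(gains)), gains
```

With a uniform belief over two hypotheses, an action whose observations perfectly discriminate them is worth one full bit, while an uninformative action is worth zero, so the greedy planner correctly prefers the disambiguating interaction.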