This lab covers the implementation, simulation, and experimental results of two ROS nodes that perform mapping and state estimation (localization) using the TurtleBot's on-board sensors and Kinect vision system.
I will make better videos once I can get my hands on a true PC (Parallels cannot handle the ROS simulation on macOS), and sadly I did not screen-capture this previously.

The first video shows the TurtleBot mapping out the Student Design shop artificial arena set up by our class. The robot also performs localization, i.e., it estimates its position in the arena using a particle filter. It is not performing SLAM, as our team did not have enough time to get that working: we ran into transform issues between the IPS-published data and our robot that prevented us from doing SLAM. The second video shows the robot mapping in simulation.

I attached our design documentation for anyone interested in learning how our algorithms work: Download Design Docs
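To give a flavor of how particle-filter localization like the one above works, here is a minimal, self-contained sketch in plain Python/NumPy. It is not our actual ROS node: it uses a hypothetical 1D corridor with a single known landmark at position 10.0, and all noise values and step counts are made-up assumptions for illustration. The structure (predict, weight, resample) is the standard particle-filter loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from our lab): a robot drives down a 1D corridor
# and measures its range to a single known landmark.
LANDMARK = 10.0
N = 1000  # number of particles

def predict(particles, control, motion_noise=0.1):
    """Propagate each particle by the commanded motion plus Gaussian noise."""
    return particles + control + rng.normal(0.0, motion_noise, size=particles.shape)

def update(particles, weights, measurement, meas_noise=0.5):
    """Reweight particles by the Gaussian likelihood of the range measurement."""
    expected = np.abs(LANDMARK - particles)
    likelihood = np.exp(-0.5 * ((measurement - expected) / meas_noise) ** 2)
    weights = weights * likelihood
    weights += 1e-300  # guard against an all-zero weight vector
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling: duplicate likely particles, drop unlikely ones."""
    positions = (np.arange(N) + rng.random()) / N
    indices = np.searchsorted(np.cumsum(weights), positions)
    return particles[indices], np.full(N, 1.0 / N)

# Simulate: the true robot starts at 0 and moves +1 per step,
# sensing its (noisy) range to the landmark after each move.
particles = rng.uniform(0.0, 10.0, N)  # uniform prior over the corridor
weights = np.full(N, 1.0 / N)
true_pos = 0.0
for _ in range(8):
    true_pos += 1.0
    particles = predict(particles, 1.0)
    z = abs(LANDMARK - true_pos) + rng.normal(0.0, 0.5)
    weights = update(particles, weights, z)
    particles, weights = resample(particles, weights)

estimate = np.average(particles, weights=weights)  # should be near true_pos = 8
```

The real thing (e.g., ROS's AMCL-style localization) does the same loop in 2D with an occupancy-grid map and a laser scan likelihood model instead of a single landmark, but the predict/weight/resample skeleton is identical.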