ECE.18 – Tonkla – Autonomous Driving System
Team Members
- Khushal Asher
- Siddharth Chatrath
- Johnnino Villaruel
- Tianxiao Ye
- Yifei Zhao
Project Description
Autonomous vehicles are a rapidly evolving field, and new applications continue to emerge. With the advent of the pandemic and the growth of food delivery services, the use of small-scale autonomous robotic systems is expanding. We developed, iteratively tested, and implemented a multi-sensor fusion system based on the Robot Operating System (ROS), delivering outstanding functionality and an exceptional user experience.
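For readers curious about what the ROS side of a fusion pipeline like this can look like, the sketch below shows one way LiDAR scans and depth-camera frames might be time-synchronized before being fused. The topic names, message types, and placeholder fusion step are illustrative assumptions, not Tonkla's exact code.

```python
#!/usr/bin/env python
# Minimal sketch: time-synchronize LiDAR scans and depth-camera frames in ROS
# before handing them to a fusion step. Topic names, message types, and the
# placeholder fusion logic are assumptions for illustration only.
import rospy
import message_filters
from sensor_msgs.msg import LaserScan, Image


def fuse(scan, depth_image):
    # Placeholder: a real pipeline would feed the synchronized LiDAR scan and
    # depth frame into the 3D mapping / costmap layer here.
    rospy.loginfo("Fused pair: scan stamp %s, depth stamp %s",
                  scan.header.stamp, depth_image.header.stamp)


if __name__ == "__main__":
    rospy.init_node("sensor_fusion_sketch")

    lidar_sub = message_filters.Subscriber("/scan", LaserScan)
    depth_sub = message_filters.Subscriber("/camera/depth/image_raw", Image)

    # Approximate time synchronization pairs up messages whose timestamps
    # fall within `slop` seconds of each other.
    sync = message_filters.ApproximateTimeSynchronizer(
        [lidar_sub, depth_sub], queue_size=10, slop=0.1)
    sync.registerCallback(fuse)

    rospy.spin()
```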
With autonomous positioning and navigation, visual tracking, and human skeleton recognition, this smart machine builds on the foundation laid by earlier robotic technologies. Our system fuses a range of sensors, including LiDAR and depth cameras, to build a 3D map of its surroundings while keeping a 50 cm clearance from obstacles. The result is a system that can localize itself and navigate challenging environments, such as the UIC campus, with ease. What makes this project stand out is the addition of advanced visual tracking and human skeleton recognition: Tonkla can follow moving targets and recognize human motion, which speeds up how quickly it can traverse its route and deliver items.
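To make the 50 cm clearance concrete, here is a minimal sketch of a safety check that halts the base whenever a LiDAR return falls inside that radius. The topic names and the simple stop-only behavior are assumptions for illustration; the actual system's avoidance behavior is more involved.

```python
#!/usr/bin/env python
# Minimal sketch of a 50 cm clearance rule: stop the base whenever any LiDAR
# return falls inside the safety radius. Topic names and the stop-only
# behavior are illustrative assumptions, not the project's actual logic.
import math
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

SAFETY_RADIUS_M = 0.5  # the 50 cm clearance described above


class ClearanceGuard(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan)

    def on_scan(self, scan):
        # Drop invalid returns (inf/NaN), then check the nearest obstacle.
        valid = [r for r in scan.ranges
                 if not math.isinf(r) and not math.isnan(r)]
        if valid and min(valid) < SAFETY_RADIUS_M:
            # Publish a zero-velocity command to halt the robot.
            self.cmd_pub.publish(Twist())


if __name__ == "__main__":
    rospy.init_node("clearance_guard_sketch")
    ClearanceGuard()
    rospy.spin()
```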
Our design delivers improved safety and a practical implementation, illustrating the future of robotics and offering a new, creative approach to autonomous transportation. With its impressive capabilities, our “smart friend” Tonkla promises to change the way goods are delivered and the way we interact with autonomous robotics. We cannot wait for you to experience it!