Date: January 2020 - April 2022
Created By: Alex Frye, Rachel Lauer, Logan Vojta, Samuel Ryckman, Seth Snyder
Purpose: To autonomously traverse a racecourse.
Cost: ~$800
Features: ROS and fusion of multiple sensors
This bot was built and programmed to compete in the National Robotics Challenge (NRC) Autonomous Vehicle Competition (AVC). The purpose of this competition is to build and program a self-driving car that navigates a course in the shortest time possible. Points are awarded for speed and for successfully navigating through obstacles.
Mechanical
This robot was built on an old Traxxas RC car chassis that we had from a previous competition. This chassis is built for high speeds and difficult terrain. As such, it was extremely capable of handling the course.
The chassis was in need of repair as some parts were damaged from previous use. So the first step was replacing these components. We decided that even though we wholeheartedly trust our programming team, upgrading these parts to aluminum instead of plastic would be a good idea in case the car should ever accidentally drive into something.
Once these repairs were completed, the next challenge was coming up with a mounting solution for the electronics. Originally, the plan was to replace the old car topper with a custom 3D printed one, but due to time constraints the electronics were instead rearranged to fit within the original plastic topper.
The electronics are mounted to 3D printed mounting plates, which are attached to an aluminum brace supported by the original mounts for the plastic topper. The topper is then held on by bolts that pass through 3D printed standoffs, which in turn support the electronics platform.
Electrical
This system is run primarily off of an Nvidia Jetson Nano, a small but powerful single-board computer. An Arduino BLE is used to interface with the various sensors and motors; it also has a built-in IMU. Besides this IMU, the bot relies on a GPS and a motor tachometer for measuring speed and position. A full diagram of the system is shown below.
Programming
For programming, there were two main components: the low-level functionality that runs on the Arduino and the higher-level ROS functionality that runs on the Jetson. Additionally, early in the project we planned on implementing computer vision code with OpenCV. Eventually, it was decided we could navigate the course successfully using only the other sensors, so we dedicated our resources toward those efforts.
The Arduino’s main purpose on the bot is to control the motors and interface with the sensors. Both the main drive motor and the steering servos are controlled by PWM signals. The Arduino is connected to the tachometer and the built-in IMU. It takes values from these devices and makes them available to the Jetson. The communications between the Jetson and the Arduino are done over UART. The Arduino also regulates the speed of the main motor using a PID loop with feedback from the tachometer. The speed estimates from the tachometer are not very precise, but they are good enough for some coarse speed control.
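To give a rough idea of what that Arduino loop looks like, here is a minimal sketch (not the actual competition code) that counts tachometer pulses in an interrupt, runs a PID loop on the measured speed, drives the ESC and steering servo with servo-style PWM, and reports the speed to the Jetson over UART. The pin assignments, tachometer calibration, and PID gains are placeholders, not the values used on the bot.

```cpp
// Minimal sketch of the Arduino-side speed control described above.
// Pin numbers, pulses-per-meter, and PID gains are assumptions.
#include <Servo.h>

const byte TACH_PIN = 2;               // assumed tachometer input (interrupt-capable)
const byte ESC_PIN = 9;                // assumed drive motor ESC output
const byte STEER_PIN = 10;             // assumed steering servo output
const float PULSES_PER_METER = 50.0;   // assumed tach calibration
const unsigned long LOOP_MS = 50;      // control loop period

volatile unsigned long pulseCount = 0;
void tachISR() { pulseCount++; }

Servo esc, steer;
float targetSpeed = 0.0;               // m/s, commanded by the Jetson
float integral = 0.0, lastError = 0.0;
const float KP = 40.0, KI = 10.0, KD = 2.0;   // assumed gains

void setup() {
  Serial.begin(115200);                // UART link to the Jetson
  pinMode(TACH_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(TACH_PIN), tachISR, RISING);
  esc.attach(ESC_PIN);
  steer.attach(STEER_PIN);
  esc.writeMicroseconds(1500);         // neutral throttle
  steer.writeMicroseconds(1500);       // centered steering
}

void loop() {
  // Accept a new target speed (plain float, one per line) from the Jetson.
  if (Serial.available()) {
    targetSpeed = Serial.parseFloat();
    while (Serial.available()) Serial.read();   // discard the rest of the line
  }

  static unsigned long lastRun = 0;
  if (millis() - lastRun < LOOP_MS) return;
  lastRun = millis();

  // Coarse speed estimate from the pulses seen this period.
  noInterrupts();
  unsigned long pulses = pulseCount;
  pulseCount = 0;
  interrupts();
  float speed = (pulses / PULSES_PER_METER) / (LOOP_MS / 1000.0);

  // PID on the speed error, output mapped onto the ESC pulse width.
  float error = targetSpeed - speed;
  integral += error * (LOOP_MS / 1000.0);
  float derivative = (error - lastError) / (LOOP_MS / 1000.0);
  lastError = error;
  int throttle = 1500 + constrain(KP * error + KI * integral + KD * derivative, -400, 400);
  esc.writeMicroseconds(throttle);

  // Report the measured speed back to the Jetson over UART.
  Serial.print("SPD ");
  Serial.println(speed, 2);
}
```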
On the Jetson side, a sensor fusion algorithm combines and weights the feedback from the various sensors. Based on these inputs, it is able to determine the car's current state on the course. The bot has preset waypoints on the course and is programmed to drive from one to the next. To help nail down the correct pathfinding algorithms and driving controls, simulations were performed within ROS.
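Below is a minimal sketch of the waypoint-following idea, not the actual ROS node or the sensor fusion itself: given a fused pose estimate, it steers toward the next preset waypoint with a simple proportional heading controller and advances once the waypoint is reached. The waypoint coordinates, tolerance, and gains are made up for illustration.

```cpp
// Simplified waypoint follower illustrating the Jetson-side logic.
// All constants here are assumptions, not the values used on the bot.
#include <cmath>
#include <cstdio>
#include <utility>
#include <vector>

struct Pose { double x, y, heading; };   // position (m) and heading (rad)
struct Waypoint { double x, y; };

class WaypointFollower {
public:
  explicit WaypointFollower(std::vector<Waypoint> wps) : waypoints_(std::move(wps)) {}

  // Returns a steering command in radians (positive = left), or 0 when finished.
  double steer(const Pose& pose) {
    if (next_ >= waypoints_.size()) return 0.0;
    const Waypoint& wp = waypoints_[next_];
    double dx = wp.x - pose.x, dy = wp.y - pose.y;

    // Advance to the following waypoint once this one is within tolerance.
    if (std::hypot(dx, dy) < kReachedTol) { ++next_; return steer(pose); }

    // Proportional controller on the heading error to the waypoint.
    double desired = std::atan2(dy, dx);
    double error = std::remainder(desired - pose.heading, 2.0 * M_PI);
    double cmd = kSteerGain * error;
    if (cmd > kMaxSteer) cmd = kMaxSteer;
    if (cmd < -kMaxSteer) cmd = -kMaxSteer;
    return cmd;
  }

  bool finished() const { return next_ >= waypoints_.size(); }

private:
  static constexpr double kReachedTol = 1.0;   // m, assumed
  static constexpr double kSteerGain = 1.5;    // assumed
  static constexpr double kMaxSteer = 0.5;     // rad, assumed steering limit
  std::vector<Waypoint> waypoints_;
  size_t next_ = 0;
};

int main() {
  // Example: steer from the origin toward three made-up waypoints.
  WaypointFollower follower({{10, 0}, {10, 10}, {0, 10}});
  Pose pose{0, 0, 0};
  std::printf("steering command: %.2f rad\n", follower.steer(pose));
}
```

In the real system, the pose fed into this kind of logic would come from the fused GPS, IMU, and tachometer estimates rather than being hard-coded.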
Results
This bot was not taken to an in-person competition because the event was moved to a virtual format. However, we were able to successfully complete a full run and record it for our submission. The run was not exactly blazing fast, but the bot did make it through all of the obstacles. A full recap can be found on the 2021 NRC Competition page.