== **Self-Driving Vehicular Project**\\
**Team**: Aaron Cruz [UG], Arya Shetty [UG], Brandon Cheng [UG], Tommy Chu [UG], Vineal Sunkara [UG], Erik Nießen [HS], Siddarth Malhotra [HS]\\
**Advisors**: Ivan Seskar and Jennifer Shane

----
**Project Description & Goals**:\\
Build and train miniature autonomous cars to drive in a miniature city.\\
RASCAL (Robotic Autonomous Scale Car for Adaptive Learning):
- Using the car's sensors, offload image and control data to a cloud server.
- The server trains a neural network that the vehicle uses to drive autonomously from camera image data.

**Technologies**: ROS (Robot Operating System), PyTorch\\
[https://gitlab.orbit-lab.org/self-driving-car-2023/]

[[Image(SDC Week 2.png, 690px)]]
[[Image(gitlab.png, 500px)]]

----
**Week 1: 5/28 - 5/30**\\
[https://docs.google.com/presentation/d/1_K0UlIok6UAKQ4FFww5lPMrha5UKXG6zL87LEalwGbc/edit? Week 1 Slides]

Progress:
- Got familiarized with the previous summer's work: [https://gitlab.orbit-lab.org/self-driving-car-2023/upcar GitLab], RASCAL setup, [https://www.orbit-lab.org/attachment/wiki/Other/Summer/2023/OH2023/P10a.jpg Poster]
- Debugged an issue with RASCAL's pure pursuit controller

----
**Week 2: 6/3 - 6/6**\\
[https://docs.google.com/presentation/d/1-SxxC-grfdlDhFj4ZKn8_SqMMATA9l2CJ__-rpB7mYw/edit?usp=sharing Week 2 Slides]

Progress:
- Set up X11 forwarding for GUI applications over SSH
- Visual odometry using the RealSense camera and [https://wiki.ros.org/rtabmap_ros/TutorialsOldInterface/StereoHandHeldMapping#Bring-up_example_with_RealSense_cameras rtabmap]
- Streamlined the data pipeline that processes bag data (car camera + control data) into .mp4 video
- Detected ArUco markers in a given image using Python and the OpenCV library
- Set up the intersection server (node with a GPU)
- Developed a PyTorch MNIST model
- Trained the "yellow thing" neural network
- Lined up a perspective drawing with the camera to determine its FOV

----
**Week 3: 6/10 - 6/13**\\
[https://docs.google.com/presentation/d/1t6Z2rI3c8sEDKmA4Er4-tidhTmySG4w02xafc-y60Hs/edit#slide=id.g2e30c923967_0_17 Week 3 Slides]

Progress:
- Created a web display "assassin" that shuts down the web server when ROS closes
- Tested the "yellow thing" model with great results
- Set up SSHFS
- Calibrated the RealSense camera
- Added a "snap picture" button to the web display for convenience
- Developed a Python script to detect an ArUco marker and estimate the camera's position
- Tested point cloud mapping with rtabmap
- Attempted sensor fusion of encoder odometry and visual odometry
- Data augmentation to artificially generate new camera perspectives from existing images

----
**Week 4: 6/17 - 6/20**\\
[https://docs.google.com/presentation/d/1wfkr_YybtKg2IlEaGD3EdoBfMPnrObRBXZ5UF_F5z-k/edit?usp=sharing Week 4 Slides]

Progress:
- Refined ArUco marker detection for more accurate car pose estimation (see the sketch below)
- Trained the model with video instead of still images
- Improved the data pipeline from the car's sensors to the server
- Refined data augmentation to simulate new camera perspectives
- Added more data visualization (Replayer) to display the steering curve, path, and images on the web server
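The ArUco work in Weeks 2-4 roughly follows the standard OpenCV marker pipeline: detect marker corners, then solve for the marker pose relative to the camera. Below is a minimal sketch, not the team's actual script; it assumes the legacy `cv2.aruco` API from `opencv-contrib-python` (before 4.7), a 4x4 marker dictionary, a known marker side length, and placeholder calibration values.

{{{#!python
# Minimal ArUco detection + camera pose sketch (illustrative only).
# Assumes opencv-contrib-python with the legacy cv2.aruco API (< 4.7),
# a 4x4 marker dictionary, and placeholder camera calibration values.
import cv2
import numpy as np

MARKER_LENGTH_M = 0.10  # assumed marker side length in meters

# Placeholder intrinsics -- in practice these come from camera calibration.
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

aruco_dict = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

def estimate_marker_poses(frame):
    """Return a list of (marker_id, rvec, tvec) for each detected marker."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict, parameters=params)
    if ids is None:
        return []
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)
    return [(int(i), r, t) for i, r, t in zip(ids.flatten(), rvecs, tvecs)]

if __name__ == "__main__":
    image = cv2.imread("frame.png")  # hypothetical test image
    for marker_id, rvec, tvec in estimate_marker_poses(image):
        # tvec is the marker position in the camera frame; inverting that
        # transform gives the camera (car) pose relative to the marker.
        print(marker_id, tvec.ravel())
}}}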
----
**Week 5: 6/24 - 6/27**\\
[https://docs.google.com/presentation/d/1ib4H_31kK6K8KJLAAKhDrqawSpKOKh3EucpwhRRhLlc/edit?usp=sharing Week 5 Slides]

Progress:
- ArUco marker detection now updates the car's position within the XY plane
- Finished the self-calibration system
- Addressed normalization and cropping problems
- Introduced Grad-CAM heat maps (illustrative sketch under Additional Resources)
- Resolved a Python version mismatch issue
- Visualized training data bias with a histogram
- Smoothed data to reduce inconsistency in the training data
- Simulation camera: skews the closest image to simulate a new view
- Added web display improvements: search commands, controller keybinds

----
**Week 6: 7/1 - 7/3**\\
[https://docs.google.com/presentation/d/1JeKKRCbCkM4q0PWf-c7rgTZLXl4UrGW_TE5X6pYmlZo/edit?usp=sharing Week 6 Slides]

Progress:
- Curvature interpolation and calibration
- Trained the model using double orange barriers
- Made training batches less zero-biased
- Addressed data augmentation blur artifacts during NN normalization by using a solid fill
- Simulated driving with the ML model and image skewing
- Improvements to ArUco detection: code refactoring, support for multiple markers

----
**Week 7: 7/8 - 7/11**\\
[https://docs.google.com/presentation/d/1x-spBEju3J8r6fR0IPGMLdMZKGpu03wEZ1_8nIBoSII/edit?usp=sharing Week 7 Slides]

Progress:
- RASCAL simulator on the server
- Updated CAD models for RASCAL
- YOLO (You Only Look Once) object detection and distance estimation
- City training data collected along a constrained path
- Automated testing and evaluation for models
- ArUco detection application: integrated into the city, testing with the pursuit loop

----
**Week 8: 7/15 - 7/18**\\
[https://docs.google.com/presentation/d/1zJWkaCOrkoTopz8FGwz8MWgj_gRL-YvsEY47UMldGI8/edit?usp=sharing Week 8 Slides]

Progress:
- Spline processing & model: calculates the future trajectory from a best-fit spline (illustrative sketch under Additional Resources)
- Set up a virtual machine & documentation
- YOLO: cone & barrier detection
- Hyperparameter optimization for the model
- Automation, testing, and adjustments for ArUco detection

----
**Week 9: 7/22 - 7/25**\\
[Week 9 Slides]

Progress:

----
**Week 10: 7/29 - 8/1**\\
[Week 10 Slides]

Progress:

----
**Week 10.5: 8/5 - 8/7**\\
[FINAL PRESENTATION]

Progress:

----
**Additional Resources:**
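Illustrative sketch of the Week 8 spline-based trajectory idea (not the team's code): fit a parametric smoothing spline to logged (x, y) path points with SciPy, then sample it ahead of the car and compute curvature as a steering target. The sample path points and smoothing factor below are assumptions.

{{{#!python
# Illustrative sketch of best-fit spline trajectory processing (Week 8 idea).
# Not the team's actual code; the sample path and smoothing factor are made up.
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical noisy (x, y) path points logged from the car.
x = np.array([0.0, 0.5, 1.1, 1.6, 2.2, 2.7, 3.3])
y = np.array([0.0, 0.2, 0.3, 0.7, 1.2, 1.9, 2.8])

# Fit a parametric smoothing spline through the path.
tck, u = splprep([x, y], s=0.05)

# Sample the fitted spline densely to get a predicted future trajectory.
u_fine = np.linspace(0.0, 1.0, 50)
traj_x, traj_y = splev(u_fine, tck)

# First and second derivatives give heading and curvature along the spline,
# which can be turned into a steering target.
dx, dy = splev(u_fine, tck, der=1)
ddx, ddy = splev(u_fine, tck, der=2)
curvature = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

print(list(zip(traj_x[:5], traj_y[:5])))
print("max curvature:", float(np.max(np.abs(curvature))))
}}}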
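Illustrative Grad-CAM sketch related to the Week 5 heat-map item (not the team's implementation): hooks capture the activations and gradients of a chosen convolutional layer, and the gradient-weighted activations form a coarse heat map showing which image regions drive the prediction. The toy steering CNN, target layer, and input size are assumptions.

{{{#!python
# Minimal Grad-CAM sketch (Week 5 heat-map idea). Illustrative only:
# the toy steering CNN and the chosen target layer are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy convolutional regressor standing in for a steering model.
model = nn.Sequential(
    nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1),
)
target_layer = model[2]  # last conv layer

activations, gradients = {}, {}

def fwd_hook(module, inputs, output):
    activations["value"] = output.detach()

def bwd_hook(module, grad_input, grad_output):
    gradients["value"] = grad_output[0].detach()

target_layer.register_forward_hook(fwd_hook)
target_layer.register_full_backward_hook(bwd_hook)

def grad_cam(image):
    """Return a heat map (H', W') for a single 3xHxW image tensor."""
    model.zero_grad()
    output = model(image.unsqueeze(0))   # predicted steering value
    output.sum().backward()              # gradient of the prediction
    acts = activations["value"][0]       # (C, H', W')
    grads = gradients["value"][0]        # (C, H', W')
    weights = grads.mean(dim=(1, 2))     # per-channel importance
    cam = F.relu((weights[:, None, None] * acts).sum(dim=0))
    return cam / (cam.max() + 1e-8)

if __name__ == "__main__":
    dummy = torch.rand(3, 120, 160)  # hypothetical camera frame
    heat = grad_cam(dummy)
    print(heat.shape)  # coarse map to upsample over the input image
}}}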