First-Person View Self-Driving Car
Team: Adam Daniel Strub, Elliot Li, Mei Mei Castranova, and Divya Krishna
Directed by: Professor Rich Martin
Objective: In this project, we use specialized low-latency cameras and radios to operate remote-control cars. The goal is to control the cars over virtual network connections and to evaluate how controllable they are inside a model city environment.
Abstract
The most common remotely piloted systems are drones and aircraft. This project explores the concept of remotely piloting a car over a local network: as long as both computers (the laptop and the Raspberry Pi) are on the same network, the car can be controlled from anywhere on it.
Hardware & Software
Robotic Car: Osoyoo Servo Steer Smart Car
Computer System: Raspberry Pi 4 Model B
Camera: Runcam2, used as a USB camera
Local Area Network:
Also includes a micro servo motor so the camera can pan for a wider field of view
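As a rough illustration of the pan mount, the sketch below sweeps a micro servo with the RPi.GPIO PWM interface; the GPIO pin and duty-cycle values are assumptions and would need to match the actual wiring and servo.

    # Servo-pan sketch (assumes the servo signal wire is on GPIO 18, BCM numbering;
    # the duty-cycle values are typical for hobby servos and may need tuning).
    import time
    import RPi.GPIO as GPIO

    SERVO_PIN = 18                    # assumed pin for the servo signal line

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(SERVO_PIN, GPIO.OUT)
    pwm = GPIO.PWM(SERVO_PIN, 50)     # 50 Hz control signal
    pwm.start(7.5)                    # roughly the center position

    try:
        for duty in (5.0, 7.5, 10.0): # sweep left, center, right
            pwm.ChangeDutyCycle(duty)
            time.sleep(1)
    finally:
        pwm.stop()
        GPIO.cleanup()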
Sockets
We used UDP sockets to communicate between the Raspberry Pi on the car and a laptop, connecting the two over the local virtual network. The server side runs on the Raspberry Pi, while the client side runs on the laptop.
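A minimal sketch of that split is shown below, assuming a plain-text command protocol; the port number, command string, and Pi address are placeholders, not the project's actual protocol.

    # Server side (runs on the Raspberry Pi): receive drive commands over UDP.
    import socket

    HOST, PORT = "0.0.0.0", 5005              # assumed port; listen on all interfaces

    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind((HOST, PORT))

    while True:
        data, addr = server.recvfrom(1024)    # blocking receive of one datagram
        command = data.decode().strip()
        print(f"received {command!r} from {addr}")
        # the real code would translate the command into motor/servo actions here

    # Client side (runs on the laptop): send a single drive command.
    import socket

    PI_ADDR = ("192.168.1.42", 5005)          # assumed Pi address on the local network

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.sendto(b"forward", PI_ADDR)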
OpenCV and Camera
OpenCV is a computer-vision library with Python bindings that we use to process images from the camera. The Runcam2 has two USB modes: USB memory card and USB camera. In USB camera mode, the live video feed can be read directly when the camera is plugged into the Raspberry Pi.
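A minimal capture loop along these lines is sketched below; the device index 0 is an assumption and may differ depending on which video device the Runcam2 shows up as on the Pi.

    # Read frames from the Runcam2 in USB camera mode and display them.
    import cv2

    cap = cv2.VideoCapture(0)                 # assumed device index for the Runcam2
    if not cap.isOpened():
        raise RuntimeError("could not open the USB camera")

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Runcam2 feed", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"): # press q to quit
            break

    cap.release()
    cv2.destroyAllWindows()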
Challenges
Installing OpenCV (cv2) on the Raspberry Pi
Switching the camera between its two USB modes
Getting the base code running on both older and newer Raspberry Pi models
Binding the socket between the laptop and the Raspberry Pi
Future Goals
Create a 3D-printed platform for all the hardware to go on
Have a camera that detects objects and sends a message to the laptop
Drive the car reliably, without control errors, in a more complex environment