This project integrates computer vision, mechatronics, wireless communication (Bluetooth), database management, mobile app design, and sensors and actuators, and served as the final project for a course called Design of Automated Systems. I teamed up with two classmates to build a remote commandable car that automatically detects and locates balls of different colors and can then be manually navigated to a designated location. My contributions to the project were coding the ball-tracking algorithm with OpenCV, programming the Arduino microcontroller for movement control, and wiring the electrical components into the hardware circuit (yellow shaded area in Fig. 1).
System Architecture
Fig. 1 illustrates the system architecture for this project. A Raspberry Pi serves as the main computer. Node.js is installed on it to communicate with the MariaDB database, receive command signals from the remote Android app, and launch the ball-tracking C++ program.
Figure 1. System architecture of the remote commandable self-driving toy car.
A standard command-and-action sequence for the system is as follows.
- The Android user logs in to the Node.js server, which verifies the username and password against the database.
- The user sends the target ball color to the server.
- The server launches the “face_ball” program through a bash command.
- “face_ball” detects the position of the colored ball using the ball recognition algorithm.
- “face_ball” sends movement commands to the Arduino over serial USB (see the sketch after this list).
- The Arduino drives two servo motors to move the car towards the target ball.
- Once the car is close enough, a fence is lowered to physically trap the ball.
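To make the serial hand-off concrete, here is a minimal C++ sketch of how “face_ball” could open the USB serial port and send a one-byte command to the Arduino. The device path, baud rate, and command characters are illustrative assumptions, not the project's exact protocol.

```cpp
// Minimal sketch of the serial hand-off from "face_ball" to the Arduino.
// Device path, baud rate, and command codes are assumptions for illustration.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int openArduino(const char* device) {
    int fd = open(device, O_RDWR | O_NOCTTY);   // e.g. "/dev/ttyUSB0"
    if (fd < 0) return -1;

    termios tty{};
    tcgetattr(fd, &tty);
    cfsetispeed(&tty, B9600);                   // match Serial.begin(9600) on the Arduino
    cfsetospeed(&tty, B9600);
    tty.c_cflag |= (CLOCAL | CREAD);            // enable receiver, ignore modem control lines
    tcsetattr(fd, TCSANOW, &tty);
    return fd;
}

void sendCommand(int fd, char cmd) {
    write(fd, &cmd, 1);                         // e.g. 'L' = left, 'R' = right, 'F' = forward
}
```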
Hardware Design
Fig. 2 shows a 3D drawing of the toy car. Components such as the Arduino and the Raspberry Pi (RPi) are fixed to a laser-cut acrylic board with plastic standoffs. Three servo motors are used: the first two drive the left and right wheels, and the third actuates a trap that locks the ball at the front of the car.
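As a rough illustration of how the three servos might be set up in the Arduino firmware (pin numbers and the trap's rest angle are assumptions, not the actual wiring):

```cpp
// Arduino setup fragment: attach the two drive servos and the trap servo.
// Pin assignments and angles are assumptions for illustration only.
#include <Servo.h>

Servo leftWheel;    // continuous-rotation servo driving the left wheel
Servo rightWheel;   // continuous-rotation servo driving the right wheel
Servo trapServo;    // standard servo that lowers the fence/trap

void setup() {
  Serial.begin(9600);      // commands arrive from the Raspberry Pi over USB serial
  leftWheel.attach(9);
  rightWheel.attach(10);
  trapServo.attach(11);
  trapServo.write(90);     // hold the trap raised until a ball is captured
}

void loop() {
  // command handling is sketched in the Arduino Commands section below
}
```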
Figure 2. 3D illustration of the remote commandable toy car.
Figure 3. Preliminary version of the toy car.
Ball-Tracking Program
The main objective of the ball-tracking program is to navigate the toy car towards a target ball and trap it with a fence mounted on the car. The flowchart for image recognition of the ball is shown in Fig. 4.
Figure 4. Flowchart for the ball detection algorithm.
With this algorithm, the target ball can be detected rapidly. Here’s a demonstration of the program recognizing a ball thrown into the air in real time.
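The exact pipeline is the one shown in Fig. 4; as a simplified OpenCV sketch of the same idea, color-based detection can be done by thresholding in HSV space and taking the largest blob. The HSV ranges, filtering steps, and size threshold below are illustrative assumptions, not the tuned values used in “face_ball”.

```cpp
// Simplified color-based ball detection (illustrative, not the exact Fig. 4
// pipeline): threshold in HSV, clean the mask, and take the largest blob.
#include <algorithm>
#include <vector>
#include <opencv2/opencv.hpp>

// Returns true and fills `center` if a ball within the given HSV range is found.
bool findBall(const cv::Mat& frame, const cv::Scalar& hsvLow,
              const cv::Scalar& hsvHigh, cv::Point2f& center) {
    cv::Mat hsv, mask;
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, hsvLow, hsvHigh, mask);                  // keep only the target color
    cv::erode(mask, mask, cv::Mat(), cv::Point(-1, -1), 2);   // remove small speckles
    cv::dilate(mask, mask, cv::Mat(), cv::Point(-1, -1), 2);

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return false;

    // Use the largest contour's enclosing circle as the ball estimate.
    auto largest = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });
    float radius = 0.0f;
    cv::minEnclosingCircle(*largest, center, radius);
    return radius > 5.0f;   // reject tiny detections (threshold is an assumed value)
}
```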
For movement control, the Arduino turns the car left, turns it right, or drives it straight ahead according to the control algorithm shown in Fig. 5.
Figure 5. Flowchart for the ball-trapping algorithm.
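Fig. 5 defines the actual decision logic; reduced to its simplest form, the steering choice compares the ball's horizontal position with the image center. The dead-band width and command characters below are assumptions for illustration.

```cpp
// Simplified steering decision (illustrative; Fig. 5 defines the real logic):
// compare the ball's x position to the image center and pick a command.
char chooseCommand(float ballX, int frameWidth) {
    const float center   = frameWidth / 2.0f;
    const float deadband = frameWidth * 0.1f;   // tolerance band (assumed value)

    if (ballX < center - deadband) return 'L';  // ball is to the left: turn left
    if (ballX > center + deadband) return 'R';  // ball is to the right: turn right
    return 'F';                                 // ball roughly centered: go straight
}
```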
[Source code for the ball-tracking program]
Arduino Commands
Fig. 6 lists the commands available for controlling the toy car through the Arduino.
Figure 6. Available commands for Arduino.
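The actual command set is the one listed in Fig. 6. The sketch below only illustrates the general pattern on the Arduino side: read one-byte commands from serial and drive the servos accordingly. The command letters and servo speed values are assumptions, and the servo objects are the ones attached in the earlier hardware sketch.

```cpp
// Illustrative Arduino command loop (the real command set is listed in Fig. 6).
// Command letters and servo speed values are assumptions; leftWheel, rightWheel,
// and trapServo are the Servo objects attached in the earlier setup() sketch.
void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    switch (cmd) {
      case 'F': leftWheel.write(180); rightWheel.write(0);   break;  // forward
      case 'L': leftWheel.write(90);  rightWheel.write(0);   break;  // turn left
      case 'R': leftWheel.write(180); rightWheel.write(90);  break;  // turn right
      case 'S': leftWheel.write(90);  rightWheel.write(90);  break;  // stop (neutral)
      case 'T': trapServo.write(0);                          break;  // lower the trap
    }
  }
}
```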
Here’s a clip of the toy car under automated control. A red ball is set as the target, and the car traps it with the fence.
Android App Design
The app is designed with MIT App Inventor, which implements a block-based programming method: developers create procedures by snapping predefined blocks together to perform a given function. Fig. 7 shows the complete block code for the Android app. The user interface and flowchart for direct control of the toy car are shown in Fig. 8.
Figure 7. Complete block code for the Android app (click for magnified view).
Figure 8. App interface (left) and Arduino control flowchart (right).
Here’s a full demonstration of the remote commandable toy car. The car automatically detects a green ball, approaches it, and traps it; it is then manually steered around obstacles to reach a yellow-colored area at the end.