Coffee Maker Alarm

This is a project my classmates and I completed in a course on practical implementation for mechatronics and system design, and it is a sequel to the earlier Aroma Alarm Clock project. We designed an Android-app-controllable alarm clock that can heat objects such as a water-filled container, and we intend to further develop it into a coffee maker, so that one can enjoy a fresh cup of coffee upon waking up in the morning. As in the previous project, I served as the engineer of our group and designed and built the hardware system.

System Framework

Fig. 1 shows the overall system architecture, in which several electronic components (e.g., clock module, LCD display) are controlled by a microcontroller that communicates with a mobile application through a Bluetooth module. The alarm clock can be controlled either with its own hardware inputs (buttons) or through the mobile app. Simply put, the system has the same functions as a typical alarm clock, plus two additional features:

1. A temperature-controllable heater.
2. Bluetooth communication.

Figure 1. System architecture for the coffee maker alarm.

Hardware

An Arduino Uno, a DS1302 chip, an HC-06, and an LCD1602 are used for the microcontroller, real-time clock module, Bluetooth module, and LCD display, respectively. As an end-of-semester class project, the components are simply connected on a breadboard. All the hardware components except the heater are packed in patterned cardboard in a plain-looking fashion (Fig. 2).
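
As a rough illustration of how these modules tie together on the Arduino side, the sketch below initializes the LCD1602 through the standard LiquidCrystal library and listens for single-character commands from the HC-06 over a software serial port. The pin assignments and the one-letter command set are assumptions for illustration, not our exact wiring or protocol; reading the DS1302 clock would additionally require an RTC library, which is omitted here.

```cpp
// Minimal sketch (illustrative pin choices) showing how the Arduino Uno
// could talk to the LCD1602 and the HC-06 Bluetooth module.
#include <LiquidCrystal.h>
#include <SoftwareSerial.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);   // RS, E, D4-D7 (assumed wiring)
SoftwareSerial bt(8, 9);                 // RX, TX connected to HC-06 (assumed)

void setup() {
  lcd.begin(16, 2);                      // LCD1602 is 16 columns x 2 rows
  lcd.print("Alarm ready");
  bt.begin(9600);                        // HC-06 default baud rate
}

void loop() {
  // Hypothetical one-letter protocol from the Android app:
  // 'A' = arm the alarm, 'D' = disarm, 'H' = start heating.
  if (bt.available()) {
    char cmd = bt.read();
    lcd.setCursor(0, 1);
    if (cmd == 'A')      lcd.print("Alarm armed   ");
    else if (cmd == 'D') lcd.print("Alarm off     ");
    else if (cmd == 'H') lcd.print("Heating...    ");
  }
}
```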

Figure 2. Hardware appearance of the coffee maker alarm.

A ceramic heater plate and a temperature sensor are integrated for temperature control (Fig. 3) and are wired out from the main hardware. The total cost of all components is 1,741 NTD (≈ 60 USD).

Figure 3. Heater and temperature sensor.
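
A simple way to realize this kind of temperature control is on/off (bang-bang) switching with a small hysteresis band around the target temperature. The sketch below illustrates the idea; the analog sensor and its conversion formula (an LM35-style 10 mV/°C output), the relay pin, and the set point are assumptions for illustration rather than the actual parts and values we used.

```cpp
// Illustrative on/off temperature control for the heater plate.
// Assumes an LM35-style analog sensor (10 mV/degC) on A0 and a
// relay/transistor driving the heater on pin 7; not the exact circuit used.
const int SENSOR_PIN = A0;
const int HEATER_PIN = 7;
const float TARGET_C = 60.0;     // desired plate temperature (assumed)
const float HYSTERESIS_C = 2.0;  // band to avoid rapid relay switching

float readTemperatureC() {
  int raw = analogRead(SENSOR_PIN);   // 0..1023 over 0..5 V
  float volts = raw * 5.0 / 1023.0;
  return volts * 100.0;               // 10 mV per degC -> degC
}

void setup() {
  pinMode(HEATER_PIN, OUTPUT);
  digitalWrite(HEATER_PIN, LOW);
}

void loop() {
  float t = readTemperatureC();
  if (t < TARGET_C - HYSTERESIS_C) {
    digitalWrite(HEATER_PIN, HIGH);   // too cold: heater on
  } else if (t > TARGET_C + HYSTERESIS_C) {
    digitalWrite(HEATER_PIN, LOW);    // too hot: heater off
  }
  delay(500);                         // sample twice per second
}
```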

Software

App Inventor 2 is used as the platform for creating the Android app that controls the hardware. Fig. 4 shows the platform's design interface for the mobile app.

Figure 4. Design interface using App Inventor 2.

The coding environment of this platform is unique and easy to get started with. It implements a block-based programming method in which developers create procedures by dragging predefined blocks together to perform a certain function. Even developers with no programming background can use this kind of environment to create mobile apps. Fig. 5 shows a gallery containing the complete code for the alarm clock control app.

Figure 5. Gallery of block codes for the alarm clock app.

Here’s a video demonstration of the system in use:

After finishing this project, I had acquired important skills in system design and mechatronics integration, which enabled me to create more interesting and sophisticated projects such as the Automated Microfluidic Controlling Platform, the Remote Commandable Self-Driving Toy Car, and the Real-time Impedance Detection Systems.

Remote Commandable Self-Driving Toy Car

This project integrates computer vision, mechatronics, wireless communication (Bluetooth), database management, mobile app design, sensors, and actuators, and was the final project of a course called Design of Automated Systems. I teamed up with two classmates to accomplish the challenging task of building a remote-commandable car that can automatically detect and approach balls of different colors and then be manually navigated to a given location. My contributions to this project were coding the ball-tracking algorithm with OpenCV, using the Arduino microcontroller for movement control, and wiring the electronic components of the hardware circuit (yellow shaded area in Fig. 1).

System Architecture

Fig. 1 illustrates the system architecture of this project. A Raspberry Pi is used as the main computer. Node.js is installed to communicate with the MariaDB database, receive order signals from a remote Android app, and invoke the ball-tracking C++ program.

Figure 1. System architecture of the remote commandable self-driving toy car.

A standard command-and-action sequence of the system is as follows (a code sketch related to steps 3 and 5 appears after the list).

  1. The Android user logs in to the Node.js server, which verifies the username and password against the database.
  2. The user sends the target ball color to the server.
  3. The server launches the “face_ball” program through a bash command.
  4. “face_ball” detects the position of the colored ball with a ball-recognition algorithm.
  5. “face_ball” sends commands to the Arduino over serial USB.
  6. The Arduino drives two servo motors to move the car toward the target ball.
  7. Once the car is close enough, a fence is deployed to physically trap the ball.
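
As a rough sketch of steps 3 and 5, the snippet below shows how a C++ program on the Raspberry Pi could take the target color as a command-line argument and send one-byte steering commands to the Arduino over the USB serial port. The device path, baud-rate setup, and one-letter command set ('L', 'R', 'F', 'S') are illustrative assumptions, not the exact interface of our “face_ball” program.

```cpp
// Illustrative skeleton of the Pi-side control path (steps 3 and 5):
// read the target color from argv, open the Arduino's USB serial device,
// and write single-character steering commands.
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <iostream>
#include <string>

int openSerial(const char* device) {
  int fd = open(device, O_RDWR | O_NOCTTY);
  if (fd < 0) return -1;
  termios tty{};
  tcgetattr(fd, &tty);
  cfsetispeed(&tty, B9600);     // match the Arduino's Serial.begin(9600)
  cfsetospeed(&tty, B9600);
  tty.c_cflag |= (CLOCAL | CREAD);
  tcsetattr(fd, TCSANOW, &tty);
  return fd;
}

int main(int argc, char** argv) {
  std::string color = (argc > 1) ? argv[1] : "red";   // target passed by the server
  int fd = openSerial("/dev/ttyACM0");                // assumed device path
  if (fd < 0) { std::cerr << "cannot open serial\n"; return 1; }

  // In the real program this command would come from the ball-detection loop;
  // here we only demonstrate sending one "move forward" command.
  char cmd = 'F';               // 'L'/'R'/'F'/'S' = left/right/forward/stop
  write(fd, &cmd, 1);
  close(fd);
  return 0;
}
```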

Hardware Design

Fig. 2 shows a 3D drawing of the toy car. Components such as the Arduino and the Raspberry Pi (RPi) are fixed to a laser-cut acrylic board with plastic columns. Three servo motors are used: the first two drive the left and right wheels, and the third controls a trap that locks the ball at the front of the car.

Figure 2. 3D illustration of the remote commandable toy car.

Figure 3. Preliminary version of the toy car.

Ball-Tracking Program

The main objective of the ball-tracking program is to navigate the toy car toward a target ball and trap it with the fence attached to the car. The flowchart for image recognition of the ball is shown in Fig. 4.

Figure 4. Flowchart for the ball detection algorithm.

Using the above algorithm, the target ball can be detected rapidly. Here’s a demonstration of the program recognizing a ball thrown into the air in real time.
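
A common way to implement this kind of color-based ball detection in OpenCV is to threshold the frame in HSV space and take the largest roughly circular contour. The snippet below is a sketch of that approach with made-up HSV bounds for a red ball; it is not the exact pipeline or the tuned thresholds used in “face_ball”.

```cpp
// Sketch of HSV-threshold ball detection with OpenCV (illustrative values).
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <vector>

// Returns true if a ball-like blob is found; outputs its center and radius.
bool detectBall(const cv::Mat& frame, cv::Point2f& center, float& radius) {
  cv::Mat hsv, mask;
  cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

  // Assumed HSV range for a red ball; real thresholds must be tuned per color.
  cv::inRange(hsv, cv::Scalar(0, 120, 70), cv::Scalar(10, 255, 255), mask);
  cv::erode(mask, mask, cv::Mat());     // clean up small noise
  cv::dilate(mask, mask, cv::Mat());

  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
  if (contours.empty()) return false;

  // Keep the largest contour and approximate it with a circle.
  auto largest = std::max_element(
      contours.begin(), contours.end(),
      [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
        return cv::contourArea(a) < cv::contourArea(b);
      });
  cv::minEnclosingCircle(*largest, center, radius);
  return radius > 5.0f;                 // reject tiny detections
}
```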

For movement control of the toy car, the Arduino will either turn left, turn right, or move straight according to the control algorithm shown in Fig. 5.

Figure 5. Flowchart for the ball-trapping algorithm.
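
The logic in Fig. 5 can be summarized as: steer toward the ball while its center sits off to one side of the image, drive forward while it is roughly centered, and deploy the fence once the ball appears large (i.e., close) enough. A rough C++ rendering of that decision rule is shown below; the thresholds and command letters are assumptions for illustration.

```cpp
// Illustrative steering decision based on the detected ball position.
// frameWidth is the camera image width; thresholds are assumed values.
char chooseCommand(float ballX, float ballRadius, int frameWidth) {
  const float closeRadius = 80.0f;            // "close enough" size (assumed)
  const float margin = frameWidth * 0.15f;    // dead band around the center

  if (ballRadius > closeRadius) return 'T';   // trap: lower the fence
  if (ballX < frameWidth / 2.0f - margin) return 'L';  // ball on the left
  if (ballX > frameWidth / 2.0f + margin) return 'R';  // ball on the right
  return 'F';                                 // roughly centered: go forward
}
```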

[Source code for the ball-tracking program]

Arduino Commands

Fig. 6 shows the commands available for controlling the toy car using Arduino.

Figure 6. Available commands for Arduino.
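
On the Arduino side, such commands can be handled with a small serial dispatch loop that maps each letter to servo motions. The sketch below is an illustration using the standard Servo library with continuous-rotation servos for the wheels; the pins, servo angles, and command letters are assumed, not the exact values in our firmware.

```cpp
// Illustrative Arduino command handler for the toy car.
// Assumes continuous-rotation servos for the wheels and a standard servo
// for the trap; pin numbers and angles are placeholders.
#include <Servo.h>

Servo leftWheel, rightWheel, trap;

void setup() {
  Serial.begin(9600);
  leftWheel.attach(9);
  rightWheel.attach(10);
  trap.attach(11);
  trap.write(0);                 // fence raised
}

void drive(int leftSpeed, int rightSpeed) {
  // For continuous-rotation servos, 90 means stop; offsets set speed/direction.
  leftWheel.write(90 + leftSpeed);
  rightWheel.write(90 - rightSpeed);   // mirrored because it faces the other way
}

void loop() {
  if (Serial.available()) {
    switch (Serial.read()) {
      case 'F': drive(30, 30);  break;   // forward
      case 'L': drive(-20, 20); break;   // rotate left
      case 'R': drive(20, -20); break;   // rotate right
      case 'S': drive(0, 0);    break;   // stop
      case 'T': trap.write(90); break;   // lower the fence to trap the ball
    }
  }
}
```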

Here’s a clip of the toy car under automated control. A red ball is defined as the target and is trapped by the car with its fence.

Android App Design

The app is designed with MIT App Inventor, which implements a block-based programming method in which developers create procedures by dragging predefined blocks together to perform a certain function. Fig. 7 shows an image containing the complete code for the Android app. The user interface and the flowchart for direct control of the toy car are shown in Fig. 8.

Figure 7. Complete block code for the Android app (click for magnified view).

Figure 8. App interface (left) and Arduino control flowchart (right).

Here’s a demonstration of the remote-commandable toy car. The car automatically detects a green ball, moves close to it, and traps it. It is then manually controlled to avoid obstacles and reach a yellow-colored area at the end.

Guess the Number (iOS)

This is the first iOS game I made with Xamarin, and it is the second project in the Guess the Number series (after Guess the Number (Windows) and before Guess the Number AI). (The two-player version of the game is also known as Bulls and Cows.)

At the start of the game, the app generates a random 4-digit code and the player begins guessing it. The player can restart the game at any time by pressing RESET, and a history of guesses and results is shown in a list at the bottom.
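
The core game logic behind this kind of app is small: generate a secret code and, for each guess, count how many digits are correct and in the right position versus correct but misplaced (the Bulls and Cows score). The C++ sketch below shows that logic in a language-neutral form; the actual app is a Xamarin/C# project, and details such as whether the secret digits must be distinct are assumptions here.

```cpp
// Language-neutral sketch of Bulls and Cows game logic (the app itself is a
// Xamarin/C# project; this only illustrates the algorithm).
#include <algorithm>
#include <iostream>
#include <random>
#include <string>

// Generate a 4-digit secret; digits are assumed to be distinct here.
std::string makeSecret() {
  std::string digits = "0123456789";
  std::shuffle(digits.begin(), digits.end(),
               std::mt19937{std::random_device{}()});
  return digits.substr(0, 4);
}

// Count bulls (right digit, right place) and cows (right digit, wrong place).
// Assumes the guess also has distinct digits; repeated digits would need
// extra bookkeeping.
void score(const std::string& secret, const std::string& guess,
           int& bulls, int& cows) {
  bulls = cows = 0;
  for (size_t i = 0; i < guess.size(); ++i) {
    if (guess[i] == secret[i]) ++bulls;
    else if (secret.find(guess[i]) != std::string::npos) ++cows;
  }
}

int main() {
  std::string secret = makeSecret();
  int bulls, cows;
  score(secret, "1234", bulls, cows);   // example guess
  std::cout << bulls << "A" << cows << "B\n";
}
```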

Here’s a demonstration of the app:

The considerations for app development are quite different from those for desktop programs: mobile platforms come in many screen sizes, and most of the time only a touch screen is available for input. After finishing this project, I had acquired some important fundamental concepts and know-how for app design.