A real-time facial emotion recognition system built with a Raspberry Pi 4 and an Arduino Uno R4 WiFi. The system detects emotions with a MobileNetV2 neural network (TFLite) and relays the results to a remote hardware node via MQTT.
The project is divided into two main nodes that communicate over the cloud:
- The Brain (Raspberry Pi 4):
  - Captures images via Picamera2.
  - Processes images using a MobileNetV2 (TFLite) model.
  - Saves logs to an SQLite database.
  - Publishes results to the HiveMQ MQTT broker.
- The Visualizer (Arduino Uno R4 WiFi):
  - Subscribes to the MQTT topic.
  - Triggers physical feedback (RGB LED and buzzer) based on the received emotion.
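The Brain publishes each detection to the broker as a small message the Arduino can parse. A minimal sketch of the payload format, assuming a JSON message and an illustrative topic name (`emotion/detections` and the field names are assumptions, not taken from the project):

```python
import json
import time

# Hypothetical topic -- the project's actual topic name may differ.
MQTT_TOPIC = "emotion/detections"

def build_payload(emotion: str, confidence: float) -> str:
    """Serialize one detection result for publishing over MQTT."""
    return json.dumps({
        "emotion": emotion,                  # e.g. "happy", "sad", "neutral"
        "confidence": round(confidence, 3),  # model score for the top class
        "timestamp": int(time.time()),       # Unix seconds, for the SQLite log
    })

# With paho-mqtt (listed in the stack below), publishing would look roughly like:
#   client.publish(MQTT_TOPIC, build_payload("happy", 0.91))
```

Keeping the payload small and flat makes it easy to parse on the Arduino side with PubSubClient's byte-array callback.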
- Languages: Python 3.x, C++ (Arduino)
- Frameworks: Flask, Flask-SQLAlchemy
- AI/ML: TensorFlow Lite, NumPy, Pillow
- Communication: Paho-MQTT, PubSubClient
- Hardware: Raspberry Pi 4, Picamera2, Arduino Uno R4 WiFi, RGB LED, Active Buzzer
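The Brain's logging step can be sketched with the standard-library `sqlite3` module; the table name and columns below are illustrative assumptions (the project itself lists Flask-SQLAlchemy, which would define the schema declaratively):

```python
import sqlite3
import time

def log_detection(db_path: str, emotion: str, confidence: float) -> None:
    """Append one detection result to the SQLite log (hypothetical schema)."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS detections ("
            "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
            "  emotion TEXT NOT NULL,"
            "  confidence REAL NOT NULL,"
            "  ts INTEGER NOT NULL)"
        )
        conn.execute(
            "INSERT INTO detections (emotion, confidence, ts) VALUES (?, ?, ?)",
            (emotion, confidence, int(time.time())),
        )
```

Using the connection as a context manager commits the insert automatically, so a crash mid-run never leaves a half-written row.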
- Clone the repository.
- Install the required Python packages: `pip install -r requirements.txt`
- Run the application: `python main.py`
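At inference time, each camera frame has to be resized and normalized to what MobileNetV2 expects. A minimal preprocessing sketch, assuming the standard 224×224 input and [-1, 1] pixel scaling (check your model's metadata for the actual values):

```python
import numpy as np
from PIL import Image

def preprocess(frame: Image.Image, size: int = 224) -> np.ndarray:
    """Resize a frame and scale pixels to [-1, 1] for MobileNetV2."""
    img = frame.convert("RGB").resize((size, size))
    x = np.asarray(img, dtype=np.float32)
    x = x / 127.5 - 1.0               # standard MobileNetV2 scaling
    return np.expand_dims(x, axis=0)  # add batch dim: (1, size, size, 3)

# The batch is then fed to the TensorFlow Lite interpreter, roughly:
#   interpreter.set_tensor(input_index, preprocess(frame))
#   interpreter.invoke()
```

If the deployed `.tflite` model is fully quantized, the input would instead be `uint8` without this scaling; the float path above matches the common MobileNetV2 export.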
- Open `arduino/emotion_node.ino` in the Arduino IDE.
- Install the WiFiS3 and PubSubClient libraries.
- Update your WiFi credentials (`ssid`, `password`) in the sketch.
- Flash the code to your Arduino Uno R4 WiFi.