Emotion AI Assistant

A real-time facial emotion recognition system built on a Raspberry Pi 4 and an Arduino Uno R4 WiFi. The system classifies emotions with a lightweight MobileNetV2 (TensorFlow Lite) model and synchronizes the results with a remote hardware actuator via MQTT.

System Architecture

The project is divided into two main nodes that communicate through a cloud MQTT broker:

  1. The Brain (Raspberry Pi 4):
    • Captures images via Picamera2.
    • Processes images using a MobileNetV2 (TFLite) model.
    • Saves logs to an SQLite database.
    • Publishes results to the HiveMQ MQTT Broker.
  2. The Visualizer (Arduino Uno R4 WiFi):
    • Subscribes to the MQTT topic.
    • Triggers physical feedback (RGB LED and Buzzer) based on the received emotion.
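The Pi-side flow above (classify, then publish) can be sketched as follows. The label order, topic name, broker host, and JSON payload shape are illustrative assumptions, not the project's actual values:

```python
import json

# Hypothetical label order; the real model's class order may differ.
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def decode_prediction(scores):
    """Map the model's raw output scores to an emotion label and confidence."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return EMOTIONS[best], scores[best]

def make_payload(emotion, confidence):
    """Serialize one detection as the JSON message sent over MQTT."""
    return json.dumps({"emotion": emotion, "confidence": round(confidence, 3)})

def publish_emotion(scores, topic="emotion/state", host="broker.hivemq.com"):
    """Publish the decoded emotion to the broker (requires paho-mqtt)."""
    import paho.mqtt.client as mqtt  # deferred so the pure helpers stay importable
    emotion, confidence = decode_prediction(scores)
    client = mqtt.Client()
    client.connect(host, 1883)
    client.publish(topic, make_payload(emotion, confidence))
    client.disconnect()
```

The Arduino node would subscribe to the same topic and parse the same JSON keys, so both sides only need to agree on the payload schema.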

Tech Stack

  • Languages: Python 3.x, C++ (Arduino)
  • Frameworks: Flask, Flask-SQLAlchemy
  • AI/ML: TensorFlow Lite, NumPy, Pillow
  • Communication: Paho-MQTT, PubSubClient
  • Hardware: Raspberry Pi 4, Picamera2, Arduino Uno R4 WiFi, RGB LED, Active Buzzer

Clone & Setup

  • Clone the repository
  • Install the required Python packages (pip install -r requirements.txt)
  • Run the application (python main.py)
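Once running, the application logs detections to SQLite (per the architecture section). A minimal sketch of that logging with Python's built-in sqlite3 module; the schema here is hypothetical, and the real project uses Flask-SQLAlchemy, so the actual table will differ:

```python
import sqlite3
from datetime import datetime, timezone

def init_db(path="emotions.db"):
    """Open the database and create the log table if it does not exist."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS emotion_log (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               ts TEXT NOT NULL,
               emotion TEXT NOT NULL,
               confidence REAL NOT NULL
           )"""
    )
    conn.commit()
    return conn

def log_emotion(conn, emotion, confidence):
    """Append one detection with a UTC timestamp."""
    conn.execute(
        "INSERT INTO emotion_log (ts, emotion, confidence) VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), emotion, confidence),
    )
    conn.commit()
```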

Arduino Configuration

  • Open arduino/emotion_node.ino in the Arduino IDE.
  • Install the WiFiS3 and PubSubClient libraries.
  • Update your WiFi credentials (ssid, password) in the sketch.
  • Flash the code to your Arduino Uno R4 WiFi.
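On the Arduino side, the subscriber callback has to map each received emotion string to an LED color. That mapping is plain C++ and can be sketched independently of the board; the palette below is illustrative, not the project's actual color scheme:

```cpp
#include <cstdint>
#include <string>

struct Rgb { uint8_t r, g, b; };

// Illustrative palette: the sketch's MQTT message callback would pass the
// payload's emotion string here and drive the RGB LED pins with the result.
inline Rgb colorFor(const std::string& emotion) {
    if (emotion == "happy") return {0, 255, 0};   // green
    if (emotion == "sad")   return {0, 0, 255};   // blue
    if (emotion == "angry") return {255, 0, 0};   // red (and trigger buzzer)
    return {255, 255, 255};                       // unknown emotion -> white
}
```

In emotion_node.ino this logic would live inside the PubSubClient message callback, with analogWrite (or digitalWrite) applying the three channel values to the LED pins.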
