# Digital Twin of Den Bosch — Backend

A production-grade backend for the Digital Twin of Den Bosch. It ingests real-time sensor data, stores time-series metrics, exposes REST APIs for dashboards, and provides a natural-language query interface for analytics.
Version: Runner 3.0
Actively maintained for demo and research follow-up (best-effort maintenance, no production SLA).
Quick Links: 📘 Overview · 🧭 Architecture · 🚀 Quick Start · ⚙️ Configuration · ☁️ Azure Deployment · 🩺 Operations
## Table of Contents

- Overview
- Architecture
- Core Services
- Data Flow
- Technology Stack
- Repository Structure
- Start Here
- Quick Start (Docker)
- Configuration
- Local Development
- Azure Deployment (VM + Docker Compose)
- Operations
- Security
- License
## Overview

This backend consolidates environmental sensor streams (CO2, NO2, PM2.5, noise, and location metadata) into a time-series store and exposes multiple access paths:
- REST APIs for dashboards and reporting
- Natural language queries translated into database queries
- Real-time streaming for live monitoring
- Anomaly detection and alerting
Primary users:

- 👤 City Ops: live monitoring and incident response
- 👤 Data Analysts: historical trends and ad-hoc queries
- 👤 Platform Engineers: deployment, scaling, and reliability
## Architecture

```mermaid
flowchart LR
  Sensors[Field Sensors] -->|MQTT/HTTP| Producers[Kafka Producers]
  Producers -->|Kafka Topics| Kafka[Kafka Broker]
  Kafka --> Consumers[Kafka Consumers]
  Consumers --> Influx[InfluxDB]
  Influx --> DashboardAPI[Dashboard API]
  Influx --> QueryAPI[LLM Query API]
  QueryAPI --> ClientApps[City Dashboards / Analysts]
  DashboardAPI --> ClientApps
  Kafka --> Detector[Anomaly Detector]
  Detector --> WebSocket[WebSocket Server]
  WebSocket --> ClientApps
```
```mermaid
flowchart TB
  subgraph DockerHost[Docker Host]
    subgraph Compose[Docker Compose Network]
      Kafka[(Kafka)]
      Influx[(InfluxDB)]
      Dashboard[Dashboard API]
      Query[LLM Query API]
      Detector[Anomaly Detector]
      WS[WebSocket Server]
    end
  end
  Users[Dashboards / Analysts] --> Dashboard
  Users --> Query
  Users --> WS
```
## Core Services

- Dashboard API: Aggregated metrics for dashboards
- LLM Query API: Natural-language interface to time-series data
- Kafka Producers: Sensor stream ingestion and simulation
- Kafka Consumers: Stream processing and persistence to InfluxDB
- Anomaly Detector: Rule-based and statistical checks
- WebSocket Server: Live updates to clients
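The Anomaly Detector's statistical checks can be pictured as a rolling z-score test. The sketch below is illustrative only — the function name, window handling, and `z_thresh` default are assumptions, not the repository's actual implementation:

```python
from statistics import mean, stdev

def is_anomalous(window, value, z_thresh=3.0):
    """Flag `value` if it deviates from the recent `window` of readings
    by more than `z_thresh` standard deviations (simple z-score rule)."""
    if len(window) < 2:
        return False  # not enough history to estimate spread
    mu, sigma = mean(window), stdev(window)
    if sigma == 0:
        return value != mu  # flat history: any change is suspicious
    return abs(value - mu) / sigma > z_thresh
```

In the running system a check like this would execute per metric (CO2, NO2, PM2.5, noise) inside the consumer loop, emitting an `anomaly` event through the WebSocket server when it fires.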
## Data Flow

- Sensors or simulators publish to Kafka topics
- Consumers persist metrics in InfluxDB
- APIs query InfluxDB for dashboards and analysis
- WebSocket server pushes real-time updates
- Detector flags anomalies and emits alerts
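The persistence step hinges on mapping each sensor payload onto an InfluxDB point. A minimal sketch of that mapping, assuming a JSON payload carrying `sensor_id`, `district`, the four metrics, and a Unix `timestamp` — the actual field names in the repository's producers may differ:

```python
import json
import time

def to_line_protocol(raw: bytes) -> str:
    """Render one sensor reading as InfluxDB line protocol:
    measurement,tag_set field_set timestamp(ns)."""
    msg = json.loads(raw)
    tags = f'sensor_id={msg["sensor_id"]},district={msg["district"]}'
    fields = ",".join(f"{k}={msg[k]}" for k in ("co2", "no2", "pm25", "noise"))
    ts_ns = int(msg.get("timestamp", time.time()) * 1e9)  # InfluxDB default is ns precision
    return f"air_quality,{tags} {fields} {ts_ns}"
```

A consumer would feed strings like this to the InfluxDB write API (or build `Point` objects with the official `influxdb-client` package instead).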
## Technology Stack

- Backend: Python 3, Flask
- Streaming: Apache Kafka
- Time-Series Storage: InfluxDB
- LLM Integration: External API provider (configured by env)
- Orchestration: Docker Compose
- Real-Time: WebSockets
## Repository Structure

```
digitaltwindenboschbackend/
├── apis/
│   ├── dashboard_api.py
│   ├── llm_influx_query_engine.py
│   └── explainer.py
├── config/
│   ├── docker-compose.yml
│   ├── docker-compose-new.yml
│   ├── telegraf.conf
│   └── telegraf_new.conf
├── consumers/
│   ├── kafka_consumer_influx.py
│   └── kafka_consumer_anomalies.py
├── data/
│   └── *.csv
├── detectors/
│   ├── anomaly_detector_websocket.py
│   └── detector_evaluation.py
├── docs/
│   └── README.md
├── evaluation/
│   └── *.csv
├── producers/
│   ├── kafka_producer_simulator.py
│   └── kafka_simulator_correlation.py
├── tests/
│   └── quick_test.py
├── utils/
│   ├── metrics_reader.py
│   ├── socket_client_tester.py
│   ├── websocket_server_emitter.py
│   ├── odin_metrics.py
│   └── odin_brain.py
├── .env.example
├── README.md
└── requirements.txt
```
## Start Here

```shell
cp .env.example .env
# Edit .env with real values
docker-compose -f config/docker-compose.yml up --build -d
```

Once running:

- WebSocket/Socket.IO: http://localhost:5000
- Dashboard API health: http://localhost:5001/health
- LLM Query API health: http://localhost:5050/health
## Quick Start (Docker)

```shell
cp .env.example .env
# Edit .env with real values
docker-compose -f config/docker-compose.yml up --build -d
```

Verify the health endpoints:

```shell
curl http://localhost:5001/health
curl http://localhost:5050/health
```

The real-time UI should connect to the WebSocket service exposed by the anomaly/streaming server.

- WebSocket (Socket.IO): http://localhost:5000
- Events: `anomaly`, `kafka_data`, `heartbeat`

If your UI expects a plain WebSocket URL, use `ws://localhost:5000` and configure the Socket.IO transport accordingly.
## Configuration

Create a local `.env` file based on `.env.example`.
Required variables:
- INFLUX_URL
- INFLUX_TOKEN
- INFLUX_ORG
- BUCKET
- HYPERBOLIC_API_KEY (or equivalent LLM provider key)
- KAFKA_BOOTSTRAP_SERVERS
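A fail-fast check for these variables at service startup saves debugging time later. A standard-library-only sketch (the variable names mirror the list above; the helper itself is not part of the repository):

```python
import os

REQUIRED_VARS = [
    "INFLUX_URL", "INFLUX_TOKEN", "INFLUX_ORG",
    "BUCKET", "HYPERBOLIC_API_KEY", "KAFKA_BOOTSTRAP_SERVERS",
]

def load_config(env=None):
    """Return the required settings, raising early if any are missing or empty."""
    env = os.environ if env is None else env
    missing = [k for k in REQUIRED_VARS if not env.get(k)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
    return {k: env[k] for k in REQUIRED_VARS}
```

Calling something like this at the top of each service makes a misconfigured `.env` fail with one clear error instead of a connection timeout deep in a request handler.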
- `config/docker-compose.yml`: primary, stable Compose file (local + standard deployments)
- `config/docker-compose-new.yml`: experimental/alternate stack; use only if you need newer service wiring
- `config/telegraf.conf`: default Telegraf configuration
- `config/telegraf_new.conf`: alternate Telegraf configuration for the experimental stack
## Local Development

Install dependencies and run the services manually:

```shell
pip install -r requirements.txt

# Terminal 1: Dashboard API
python apis/dashboard_api.py

# Terminal 2: LLM Query API
python apis/llm_influx_query_engine.py

# Terminal 3: Kafka Producer
python producers/kafka_producer_simulator.py

# Terminal 4: Kafka Consumer
python consumers/kafka_consumer_influx.py
```

## Azure Deployment (VM + Docker Compose)

This stack is multi-service and runs cleanly on a Linux VM using Docker Compose. The steps below reference official Azure and Docker documentation.
1. Create an Ubuntu Linux VM in Azure.
   - Use the Azure portal quickstart for VM creation and SSH access.
2. Install Docker Engine and Docker Compose on the VM.
   - Follow Docker's official Ubuntu installation guide.
3. Deploy the stack.

   ```shell
   # On the VM
   git clone <repository-url>
   cd digitaltwindenboschbackend
   cp .env.example .env
   # Edit .env with production values
   docker-compose -f config/docker-compose.yml up --build -d
   ```

4. Open the required ports in the Azure NSG:
   - 5001 (Dashboard API)
   - 5050 (LLM Query API)
   - 8080 (WebSocket)
   - 8086 (InfluxDB, if needed externally)
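With the Azure CLI, the NSG rules above can be opened per port using `az vm open-port`. A sketch — `myResourceGroup` and `myVM` are placeholders for your own names, and the priorities just need to be unique within the NSG:

```shell
# Open each externally required service port on the VM's NSG
az vm open-port --resource-group myResourceGroup --name myVM --port 5001 --priority 1001
az vm open-port --resource-group myResourceGroup --name myVM --port 5050 --priority 1002
az vm open-port --resource-group myResourceGroup --name myVM --port 8080 --priority 1003
# Only expose InfluxDB (8086) externally if you genuinely need it
```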
## Operations

Health checks:

```shell
curl http://localhost:5001/health
curl http://localhost:5050/health
```

Logs:

```shell
docker-compose -f config/docker-compose.yml logs -f
```

## Security

- Keep `.env` out of version control.
- Rotate tokens regularly.
- Use private subnets and NSG rules to limit external exposure.
- For production, front the APIs with a reverse proxy and TLS termination.
- Azure portal: Create a Linux VM: https://learn.microsoft.com/en-us/azure/virtual-machines/linux/quick-create-portal
- Docker Engine on Ubuntu: https://docs.docker.com/engine/install/ubuntu/
## License

This project is licensed under the Apache License 2.0. See LICENSE.