This project demonstrates how to set up a real-time sensor data dashboard using a Raspberry Pi, Apache Kafka, and Streamlit. A Raspberry Pi reads temperature and humidity data from a DHT11 sensor, a Kafka producer sends the readings to a Kafka topic, a Kafka consumer receives them and stores them in a CSV file, and a Streamlit app visualizes the data in real time.
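Before the individual steps, it may help to pin down the message that travels through this pipeline. A minimal sketch of the serialization used between producer and consumer is below; the field names are assumptions, not fixed by the project:

```python
import json
import time


def encode_reading(temperature_c, humidity_pct):
    """Serialize one sensor reading into the bytes published to the Kafka topic."""
    payload = {
        "timestamp": time.time(),          # epoch seconds, set by the producer
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
    }
    return json.dumps(payload).encode("utf-8")


def decode_reading(raw):
    """Parse the bytes back into a dict on the consumer side."""
    return json.loads(raw.decode("utf-8"))
```

Keeping the encode/decode pair in one place means the producer, consumer, and app all agree on the schema.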
- Raspberry Pi with DHT11 sensor connected.
- Apache Kafka installed and running.
- Python installed on both the Raspberry Pi and the consumer machine.
- The required Python packages installed (listed below).
- Download and install Kafka from the Apache Kafka website.
- Start ZooKeeper and the Kafka server, then create a topic named `sensor_data`:
bin/zookeeper-server-start.sh config/zookeeper.properties
bin/kafka-server-start.sh config/server.properties
bin/kafka-topics.sh --create --topic sensor_data --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
On the Raspberry Pi (producer):

pip install kafka-python adafruit-circuitpython-dht psutil
sudo apt-get install libgpiod2
On the consumer machine:

pip install kafka-python pandas
For the Streamlit app:

pip install streamlit pandas
- Copy the producer Python script to the Raspberry Pi and customize it for the sensors you are using.
- Run the consumer Python script on the consumer machine.
- To view the data in real time, place the app Python script in the same folder as the consumer script and run it.
- Running only the consumer script, without the app, still stores all data sent by the producer in the CSV file, so it can be viewed later.
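As a concrete starting point, the producer could look like the sketch below. The broker address, GPIO pin, and JSON field names are assumptions to adapt to your setup; the DHT11 frequently fails a read with a transient RuntimeError, so a small retry helper is included.

```python
import json
import time

TOPIC = "sensor_data"
KAFKA_HOST = "192.168.1.100:9092"  # assumption: replace with your consumer machine's address


def read_with_retry(read_fn, attempts=5, delay=2.0):
    """DHT11 reads fail transiently; retry a few times before giving up."""
    for _ in range(attempts):
        try:
            return read_fn()
        except RuntimeError:
            time.sleep(delay)
    return None  # all attempts failed


def main():
    # Hardware and Kafka imports stay here so the helper above is testable off-Pi.
    import board
    import adafruit_dht
    from kafka import KafkaProducer

    dht = adafruit_dht.DHT11(board.D4)  # assumption: data pin on GPIO4
    producer = KafkaProducer(
        bootstrap_servers=KAFKA_HOST,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    while True:
        reading = read_with_retry(
            lambda: {
                "timestamp": time.time(),
                "temperature_c": dht.temperature,
                "humidity_pct": dht.humidity,
            }
        )
        if reading is not None:
            producer.send(TOPIC, reading)
        time.sleep(2)  # DHT11 needs ~2 s between reads


if __name__ == "__main__":
    main()
```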
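The consumer side can be sketched as follows, assuming the producer publishes JSON objects with `timestamp`, `temperature_c`, and `humidity_pct` fields (hypothetical names); each message is appended to a CSV file that the Streamlit app later reads:

```python
import csv
import json
import os

CSV_PATH = "sensor_data.csv"  # assumption: file name shared with the app
FIELDS = ["timestamp", "temperature_c", "humidity_pct"]


def append_row(row, path=CSV_PATH):
    """Append one decoded reading to the CSV, writing a header on first use."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)


def main():
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "sensor_data",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        append_row(message.value)


if __name__ == "__main__":
    main()
```

Appending row by row (rather than rewriting the file) keeps the consumer cheap and lets the app tail the same file safely.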
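Finally, a minimal sketch of the Streamlit app: it periodically re-reads the CSV the consumer writes and redraws a chart. The file name and refresh interval are assumptions.

```python
import csv

CSV_PATH = "sensor_data.csv"  # assumption: same file the consumer writes


def load_rows(path=CSV_PATH):
    """Read the consumer's CSV into a list of dicts (strings, as csv returns them)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def main():
    import time
    import pandas as pd
    import streamlit as st

    st.title("Real-time sensor dashboard")
    chart_area = st.empty()  # placeholder we can redraw in place
    while True:
        rows = load_rows()
        if rows:
            df = pd.DataFrame(rows).astype(float)
            chart_area.line_chart(df.set_index("timestamp"))
        time.sleep(2)  # assumed refresh interval


if __name__ == "__main__":
    main()
```

Run it with `streamlit run` on the app file; the polling loop is a simple approach that works because the consumer only ever appends to the CSV.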