This project automatically collects traffic data from Rennes via an API, archives the results daily, and allows you to consult the archives through a web interface.
- Automatic collection: Queries the Rennes Open Data API every 5 minutes.
- Daily archiving: Stores each response as a JSON file in a date-named folder (YYYY-MM-DD). At each day change, the previous day's folder is zipped and then deleted.
- Archive server: Lets you browse the archives through a simple web interface.
- Logging and error handling: Errors are managed with retries and logged in script.log.
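The retry-and-log behavior described above can be sketched roughly as follows. This is an illustrative sketch, not the script's actual code: the function names, retry count, and delay are assumptions.

```python
import json
import logging
import time
from urllib.error import URLError
from urllib.request import urlopen

# Errors are appended to script.log, as described above.
logging.basicConfig(filename="script.log", level=logging.ERROR,
                    format="%(asctime)s %(levelname)s %(message)s")

def fetch_json(url: str) -> dict:
    """Download and parse one JSON response from the API."""
    with urlopen(url, timeout=30) as resp:
        return json.load(resp)

def fetch_with_retries(url: str, retries: int = 3, delay: float = 5.0,
                       fetch=fetch_json) -> dict:
    """Call fetch(url), retrying on failure; each error is logged."""
    for attempt in range(1, retries + 1):
        try:
            return fetch(url)
        except (URLError, ValueError) as exc:
            logging.error("attempt %d/%d failed for %s: %s",
                          attempt, retries, url, exc)
            if attempt < retries:
                time.sleep(delay)
    raise RuntimeError(f"giving up on {url} after {retries} attempts")
```

Injecting the `fetch` callable keeps the retry logic testable without network access; the real script may simply call the API directly.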
- Python 3.11+
- pip
```bash
pip install -r requirements.txt
```

Start the collection script:
```bash
python fetch_and_archive.py
```

The script runs continuously, automatically handling storage, archiving, and folder deletion.
Start the archive server:
```bash
flask --app archive_server:app run --debug
```

Open your browser at http://localhost:5000 to access the web interface and browse the archives.
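A minimal sketch of such an archive browser, assuming only Flask's built-in helpers; the real archive_server.py presumably renders pages from templates/ and static/ rather than an inline template.

```python
from pathlib import Path

from flask import Flask, render_template_string, send_from_directory

app = Flask(__name__)
# Assumed location of the daily zips, per the project layout.
ARCHIVE_DIR = Path("archives").resolve()

@app.route("/")
def index():
    """List the available YYYY-MM-DD.zip archives as download links."""
    zips = sorted(p.name for p in ARCHIVE_DIR.glob("*.zip"))
    return render_template_string(
        "<ul>{% for z in zips %}"
        '<li><a href="/download/{{ z }}">{{ z }}</a></li>'
        "{% endfor %}</ul>",
        zips=zips,
    )

@app.route("/download/<path:name>")
def download(name: str):
    """Serve one zip file; send_from_directory rejects paths outside ARCHIVE_DIR."""
    return send_from_directory(ARCHIVE_DIR, name, as_attachment=True)
```

Using `send_from_directory` instead of opening the file directly guards against path-traversal requests such as `/download/../script.log`.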
```bash
# run everything
docker-compose up -d

# run only the data collector
docker-compose up fetch-and-archive -d

# run only the web server
docker-compose up archive-server -d
```

- fetch_and_archive.py: Data collection and archiving script.
- archive_server.py: Web server to browse the archives.
- archives/: Folder containing the daily archive zip files (YYYY-MM-DD.zip).
- script.log: Log file.
- templates/ and static/: Files for the web interface.
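The two services could be wired together with a compose file along these lines. Only the service names come from the commands above; the build context, command lines, and volume layout are assumptions.

```yaml
# Hypothetical docker-compose.yml matching the service names above.
services:
  fetch-and-archive:
    build: .
    command: python fetch_and_archive.py
    volumes:
      - ./archives:/app/archives      # persist daily zips on the host

  archive-server:
    build: .
    command: flask --app archive_server:app run --host 0.0.0.0
    ports:
      - "5000:5000"
    volumes:
      - ./archives:/app/archives:ro   # server only needs read access
```

Sharing the archives/ volume lets the server browse zips as soon as the collector writes them.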