AI-Predictive-Maintenance-Hydro is an end-to-end industrial IoT solution designed for predictive maintenance in hydroelectric power stations. The system simulates SCADA sensor data, processes it through a robust data pipeline, and provides real-time monitoring and predictive analytics through Grafana dashboards.
- Real-time sensor data simulation and monitoring
- Predictive maintenance using machine learning
- Automated anomaly detection
- Performance optimization recommendations
- Cost reduction through preventive maintenance
- Historical data analysis and trend prediction
The project implements a modern microservices architecture:
- Data Generation Layer: Python-based sensor simulation
- Data Processing Layer: Node-RED for data transformation and routing
- Storage Layer: InfluxDB for time-series data management
- Visualization Layer: Grafana for data visualization and monitoring
- Integration Layer: Proposed Next.js frontend for enhanced dashboard access
To successfully deploy and run this project, ensure you have the following tools installed:
- Docker: For containerizing and running services
- WSL (Windows Subsystem for Linux): A Linux environment on Windows
- Python 3.8+: For simulating sensor data and preprocessing
- Node.js 18+: For the proposed frontend (optional)
- pnpm: A fast package manager for JavaScript projects (optional)
Note: Grafana, InfluxDB, and Node-RED are not prerequisites as they are provided as Docker containers.
```
.
├── frontend/                     # Proposed Next.js web interface (under development)
│   ├── app/                      # Next.js pages and API routes
│   ├── components/               # Reusable React components
│   ├── hooks/                    # Custom React hooks
│   ├── lib/                      # Utility functions
│   ├── public/                   # Static assets
│   ├── styles/                   # Global styles
│   └── next.config.js            # Next.js configuration file
│
├── python_scripts/               # Sensor simulation scripts
│   ├── simulate_sensors.py       # Main simulation script
│   ├── data_preprocessing.py     # Data preprocessing utilities
│   └── model_training.py         # AI model training script
│
├── node_red/                     # Node-RED flows and configurations
│   ├── flows.json                # Node-RED flow definitions
│   └── function_nodes/           # Custom function nodes
│
├── config/                       # Service configurations and environment variables
│   ├── mosquitto.conf            # MQTT broker configuration
│   ├── influxdb.conf             # InfluxDB configuration
│   └── grafana/                  # Grafana dashboard configurations
│
├── docker/                       # Docker-related files and instructions
├── requirements/                 # Python dependency files
├── data/                         # Persistent data storage
│   ├── influxdb/                 # InfluxDB time-series data
│   ├── mosquitto/                # MQTT broker data
│   └── grafana/                  # Grafana dashboards and data
│
├── logs/                         # Application and error logs
├── docker-compose.yml            # Docker services configuration
└── predictive_maintenance.csv    # AI model training dataset
```
1. Start Core Services with Docker Compose:

   ```bash
   docker-compose up -d
   ```
This will start the following services:
- MQTT Broker (Mosquitto) on port 1883 (for message queuing)
- InfluxDB on port 8086 (for time-series data storage)
- Node-RED on port 1880 (for data processing)
- Grafana on port 3000 (primary visualization platform)
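A `docker-compose.yml` along these lines would produce the services above. This is only a sketch: the image versions are taken from the Technology Stack section, while the volume paths mirror the project structure and are assumptions — the repository's actual compose file is authoritative.

```yaml
version: "3.8"
services:
  mosquitto:
    image: eclipse-mosquitto:2.0.15
    ports: ["1883:1883"]
    volumes:
      - ./config/mosquitto.conf:/mosquitto/config/mosquitto.conf
      - ./data/mosquitto:/mosquitto/data
  influxdb:
    image: influxdb:2.6
    ports: ["8086:8086"]
    volumes:
      - ./data/influxdb:/var/lib/influxdb2
  nodered:
    image: nodered/node-red
    ports: ["1880:1880"]
    volumes:
      - ./node_red:/data
  grafana:
    image: grafana/grafana:10.0.0
    ports: ["3000:3000"]
    volumes:
      - ./data/grafana:/var/lib/grafana
networks:
  default:
    name: hydro_net   # custom bridge network
```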
2. Configure Core Services:
   a. InfluxDB Setup:
      - Access the InfluxDB UI at http://localhost:8086
      - Log in with the default credentials (admin/admin123)
      - Create a new bucket named "hydro_data"
      - Generate an API token for Node-RED integration
      - Configure data retention policies
      - Set up Data Explorer queries for the sensor data
      - Configure data downsampling for long-term storage
      - Set up tasks (the InfluxDB 2.x replacement for continuous queries) for data aggregation
   b. Node-RED Configuration:
      - Access the Node-RED UI at http://localhost:1880
      - Import the provided flows from node_red/flows.json
      - Configure MQTT nodes to subscribe to the sensor topics
      - Set up InfluxDB output nodes with your API token
      - Configure function nodes for data transformation
      - Set up debug nodes for monitoring the data flow
      - Configure error handling and retry mechanisms
      - Set up data validation and cleaning nodes
   c. Grafana Setup:
      - Access Grafana at http://localhost:3000
      - Log in with the default credentials (admin/admin123)
      - Add InfluxDB as a data source
      - Import the dashboards from data/grafana
      - Configure alert rules for predictive maintenance
      - Set up user authentication and permissions
      - Configure dashboard variables and templates
      - Set up notification channels for alerts
3. Set Up the Python Environment for Sensor Simulation:

   ```bash
   python -m venv venv
   source venv/bin/activate    # On Windows: .\venv\Scripts\activate
   pip install -r requirements/requirements.txt
   ```
4. Run the Sensor Simulation Script:

   ```bash
   python python_scripts/simulate_sensors.py
   ```
This script simulates SCADA system sensor data including:
- Temperature readings (°C)
- Pressure measurements (bar)
- Vibration levels (mm/s)
- Flow rates (m³/s)
- Power output (MW)
- Equipment status (operational/standby/maintenance)
- Water level (m)
- Turbine efficiency (%)
- Generator voltage (kV)
- Oil temperature (°C)
Note: The Kaggle dataset (predictive_maintenance.csv) is optional and can be used for training the AI model; the simulation script provides real-time data for testing and development.
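The core of the simulation loop can be sketched as follows. This is a minimal illustration rather than the full simulate_sensors.py: the field names, value ranges, and MQTT topic are assumptions, and the paho-mqtt publish step is left commented out so the sketch runs standalone.

```python
import json
import random
import time

def simulate_reading():
    """Generate one plausible set of hydro-station sensor values."""
    return {
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(40.0, 70.0), 2),
        "pressure_bar": round(random.uniform(10.0, 15.0), 2),
        "vibration_mm_s": round(random.uniform(0.5, 4.0), 2),
        "flow_rate_m3_s": round(random.uniform(80.0, 120.0), 2),
        "power_output_mw": round(random.uniform(20.0, 50.0), 2),
        "status": random.choice(["operational", "standby", "maintenance"]),
    }

payload = json.dumps(simulate_reading())

# To publish to the Mosquitto broker started by docker-compose (topic name
# is a placeholder), install paho-mqtt and uncomment:
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.connect("localhost", 1883)
# client.publish("hydro/sensors/turbine1", payload)
print(payload)
```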
The project implements a comprehensive predictive maintenance model that can be trained using either:
- Simulated Data: Real-time sensor data from the simulation script
- Historical Data: The provided Kaggle dataset
- Input Layer: Time-series sensor data
- Processing Layer:
  - LSTM for temporal pattern recognition
  - Random Forest for feature importance
  - Isolation Forest for anomaly detection
- Output Layer:
  - Failure probability prediction
  - Maintenance recommendations
  - Performance degradation metrics
- Time-series forecasting
- Pattern recognition
- Threshold-based alerts
- Confidence scoring
- Automated retraining
- Feature importance analysis
- Anomaly detection
- Maintenance scheduling optimization
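As a minimal sketch of the anomaly-detection capability, an Isolation Forest can be fit on readings from normal operation and used to flag outliers. This assumes scikit-learn and illustrative feature ranges; the project's actual model lives in python_scripts/model_training.py.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Simulated normal operation: temperature (°C), pressure (bar), vibration (mm/s)
normal = rng.normal(loc=[55.0, 12.0, 2.0], scale=[3.0, 0.5, 0.3], size=(500, 3))

model = IsolationForest(contamination=0.01, random_state=42).fit(normal)

# A clearly abnormal reading (overheating, high vibration) should be flagged as -1
print(model.predict([[95.0, 12.0, 9.0]])[0])
```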
```python
# Example of model training configuration
model_config = {
    'lstm_layers': [64, 32],
    'forecast_horizon': 24,          # hours
    'confidence_threshold': 0.85,
    'retraining_interval': '1d',
    'features': [
        'temperature',
        'pressure',
        'vibration',
        'flow_rate',
        'power_output',
    ],
}
```

The frontend is designed to integrate with Grafana dashboards, providing:
- Custom navigation and layout
- User authentication and authorization
- Additional UI components and interactions
- Mobile-responsive design
1. Iframe Embedding:

   ```jsx
   // Example of embedding a Grafana dashboard
   <iframe
     src="http://localhost:3000/d/your-dashboard-id"
     width="100%"
     height="600px"
     frameBorder="0"
     allowFullScreen
   />
   ```
2. Grafana API Integration:

   ```javascript
   // Example of fetching dashboard data
   const fetchDashboardData = async () => {
     const response = await fetch('http://localhost:3000/api/dashboards/db/your-dashboard-id', {
       headers: {
         'Authorization': `Bearer ${GRAFANA_API_KEY}`
       }
     });
     return response.json();
   };
   ```

3. Custom Dashboard Components:
- Create wrapper components for Grafana panels
- Implement custom navigation
- Add additional UI elements
To set up the frontend (optional):

1. Navigate to the Frontend Directory:

   ```bash
   cd frontend
   ```

2. Install Dependencies:

   ```bash
   pnpm install
   ```

3. Configure Environment Variables:

   ```bash
   NEXT_PUBLIC_GRAFANA_URL=http://localhost:3000
   NEXT_PUBLIC_GRAFANA_API_KEY=your_api_key
   ```

4. Run the Development Server:

   ```bash
   pnpm dev
   ```
Note: The frontend is still in development and not required for the core functionality of the system.
- Sensor Data Simulation: Python scripts generate realistic data streams
- Message Brokering: Mosquitto MQTT handles communication between services
- Time-Series Database: InfluxDB stores and manages sensor data
- Node-RED Workflows: For efficient data routing and processing
- Grafana Dashboards: Primary visualization platform with:
- Real-time sensor data monitoring
- Historical data analysis
- Custom alerts and notifications
- Predictive maintenance insights
- Mobile-responsive dashboards
- Automated report generation
- Custom plugin integration
- Role-based access control
- Grafana Integration: Seamless embedding of Grafana dashboards
- Modern UI: Built with Next.js 15 and Tailwind CSS
- Type Safety: Ensures code quality with TypeScript
- Dark Mode Support: Seamless theme toggling
- Custom Navigation: Enhanced user experience for dashboard navigation
- User Management: Role-based access control
- API Integration: RESTful endpoints for data access
- Real-time Updates: WebSocket integration for live data
The system follows a streamlined data flow for predictive maintenance:
- Sensor Data Simulation: Python scripts simulate hydro station sensor readings
- Data Ingestion: Mosquitto MQTT broker collects and forwards data
- Processing: Node-RED processes and routes data to InfluxDB
- Storage: InfluxDB manages time-series data for efficient querying
- Visualization: Grafana provides primary visualization through dashboards
- AI Predictions: Predictive maintenance models forecast potential equipment failures
- Frontend Integration: (Under Development) Next.js interface for enhanced Grafana dashboard access
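The Processing step above amounts to reshaping MQTT JSON payloads into InfluxDB records. In the project this is handled by a Node-RED flow, but the mapping can be sketched in Python using InfluxDB's line-protocol format (the measurement, tag, and field names here are hypothetical):

```python
def to_line_protocol(measurement, tags, fields, ts_ns):
    """Render one sensor reading as an InfluxDB line-protocol record."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "turbine",
    {"station": "hydro1", "unit": "t01"},
    {"temperature": 54.2, "vibration": 1.8},
    1700000000000000000,
)
print(line)
# → turbine,station=hydro1,unit=t01 temperature=54.2,vibration=1.8 1700000000000000000
```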
- Data Processing: Node-RED
- Time-Series Database: InfluxDB v2.6
- Message Broker: Mosquitto MQTT v2.0.15
- Visualization: Grafana v10.0.0
- Language: Python 3.8+
- Libraries:
- Pandas for data manipulation
- NumPy for numerical operations
- Paho-MQTT for MQTT communication
- Scikit-learn for machine learning
- Platform: Docker
- Orchestration: Docker Compose
- Networking: Custom bridge network
- Framework: Next.js 15
- UI Library: React 19
- Styling: Tailwind CSS
- Type System: TypeScript
- State Management: React Hooks
- API Client: Axios/Fetch
- Package Manager: pnpm
- Linting: ESLint
- CSS Processing: PostCSS
- Version Control: Git
- CI/CD: GitHub Actions (planned)
For production deployment, consider using DigitalOcean, AWS, or Azure for hosting the Docker containers. Follow these steps for deployment:
1. Deploy Core Services:

   ```bash
   docker-compose up -d
   ```

2. Configure Grafana:
   - Set up authentication
   - Import dashboards
   - Configure data sources
   - Set up alerts

3. Optional: Deploy the Frontend (When Ready):

   ```bash
   cd frontend
   pnpm build
   pnpm start
   ```
Note: The frontend deployment is optional and can be added once development is complete.
For assistance or inquiries, please contact us.
- Special thanks to the open-source community for invaluable resources
- Gratitude to Kaggle for the predictive maintenance dataset
- Appreciation to all contributors and maintainers of the open-source tools used in this project
- Recognition to the industrial IoT community for best practices and standards