This project demonstrates a distributed WebSocket system with load balancing, multiple WebSocket servers, and a Redis-based message broker. The system is containerized using Docker and includes a client application built with Angular.
The system consists of the following components:
- **WebSocket Servers (2 instances)**
  - Node.js-based WebSocket servers, each with a unique ID
  - Use Redis to broadcast messages between instances
  - Handle WebSocket connections and message routing
- **Load Balancer (Nginx)**
  - Distributes WebSocket connections across the server instances
  - Provides high availability and scalability
  - Configured for WebSocket protocol support
- **Redis**
  - Acts as a message broker between the WebSocket servers
  - Enables real-time message broadcasting across all server instances
  - Persists data using a Docker volume
- **Client Application (Angular)**
  - Modern web interface for interacting with the WebSocket system
  - Real-time message display and interaction
  - Responsive design
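The README does not show how a server distinguishes its own Redis messages from those published by the other instance. One common pattern is to tag every published message with the originating `SERVER_ID`; the sketch below illustrates that idea (the envelope shape and the `shouldForward` helper are assumptions for illustration, not the project's actual code):

```javascript
// Sketch of the cross-server fan-out pattern, assuming each instance gets a
// unique SERVER_ID via its environment (as described in the configuration
// section of this README).
const SERVER_ID = process.env.SERVER_ID || 'server-1';

// Wrap an outgoing chat message in an envelope tagged with this server's ID
// before publishing it to the shared Redis channel.
function envelope(payload) {
  return JSON.stringify({ origin: SERVER_ID, payload });
}

// When a message arrives on the Redis channel, forward it to this server's
// local WebSocket clients only if another instance published it.
function shouldForward(raw) {
  const { origin } = JSON.parse(raw);
  return origin !== SERVER_ID; // skip messages this instance published itself
}

const own = envelope({ username: 'kattappa', message: 'hello' });
const other = JSON.stringify({ origin: 'server-2', payload: { message: 'hi' } });
```

Tagging the origin avoids double-delivery: each server broadcasts to its own clients directly and relays via Redis only for clients connected to the other instance.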
To run the project you will need:

- Docker and Docker Compose
- Node.js (for local development)
- Angular CLI (for local client development)
- Clone the repository:

  ```bash
  git clone https://github.com/logeshkannan96/distributed-websockets.git
  cd distributed-websockets
  ```

- Start the system using Docker Compose:

  ```bash
  docker-compose up --build
  ```
This will start:
- Two WebSocket server instances
- Nginx load balancer
- Redis server
The client application will be available at http://localhost:4200
For local development:

Server:

```bash
cd server
npm install
npm run dev
```

Client:

```bash
cd client
npm install
ng serve
```
- Server instances are configured through environment variables in `docker-compose.yaml`
- Each server has a unique `SERVER_ID`
- The Redis connection is configured via `REDIS_URL`
- Nginx configuration is located in `lb/nginx.conf`
- The load balancing strategy can be modified in the configuration file
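The README does not reproduce `lb/nginx.conf`; the fragment below is a minimal sketch of what a WebSocket-capable upstream typically looks like (the upstream names match the compose service names used elsewhere in this README, but the listen/backend ports are assumptions):

```nginx
# Pool of WebSocket servers; Nginx round-robins connections by default.
upstream websocket_backend {
    server websocket-server-1:8080;
    server websocket-server-2:8080;
}

server {
    listen 80;

    location / {
        proxy_pass http://websocket_backend;
        # Headers required for the WebSocket upgrade handshake
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```

Changing the strategy (e.g. adding `least_conn;` or `ip_hash;` inside the `upstream` block) is the kind of modification the point above refers to.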
- Redis runs on the default port 6379
- Data persistence is configured through Docker volumes
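Putting the configuration points above together, the relevant parts of `docker-compose.yaml` might look like this (a sketch only; service names, ports, and paths are assumptions, not the actual file):

```yaml
# Hypothetical docker-compose.yaml fragment illustrating the configuration
# described above: per-instance SERVER_ID, REDIS_URL, and a Redis volume.
services:
  websocket-server-1:
    build: ./server
    environment:
      - SERVER_ID=server-1            # unique per instance
      - REDIS_URL=redis://redis:6379

  websocket-server-2:
    build: ./server
    environment:
      - SERVER_ID=server-2
      - REDIS_URL=redis://redis:6379

  redis:
    image: redis
    ports:
      - "6379:6379"                   # default Redis port
    volumes:
      - redis-data:/data              # persistence via a named Docker volume

volumes:
  redis-data:
```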
- Open multiple browser windows to http://localhost:4200
- Connect to the WebSocket server
- Send messages from different clients
- Observe how messages are broadcast to all connected clients
- Verify that the load balancer distributes connections across both server instances
- Redis data can be monitored using the Redis CLI:

  ```bash
  docker exec -it distributed-websocket_redis_1 redis-cli
  ```

- Server logs can be viewed using:

  ```bash
  docker-compose logs -f websocket-server-1
  docker-compose logs -f websocket-server-2
  ```
- **Connection Issues**
  - Verify all containers are running: `docker-compose ps`
  - Check server logs for errors
  - Ensure Redis is accessible from the WebSocket servers
- **Load Balancer Issues**
  - Check the Nginx configuration
  - Verify WebSocket server health
  - Check the Nginx logs: `docker-compose logs nginx`
Contributions are welcome:

- Fork the repository
- Create a feature branch
- Commit your changes
- Push to the branch
- Create a Pull Request
You can send messages to the WebSocket system using the REST API endpoint. Here are some examples:
- Send a message using curl:

  ```bash
  curl --location 'http://localhost/api/send-message' \
    --header 'Content-Type: application/json' \
    --data '{ "username": "kattappa", "message": "Swami, I have a confession to make" }'
  ```

- Send a message with different content:

  ```bash
  curl --location 'http://localhost/api/send-message' \
    --header 'Content-Type: application/json' \
    --data '{ "username": "bahubali", "message": "Kattappa, yes tell me" }'
  ```
The API accepts POST requests with the following JSON structure:

```json
{
  "username": "string",
  "message": "string"
}
```

> Note: A similar implementation using Server-Sent Events (SSE) with Redis Pub/Sub is available in the following GitHub repository: 👉 logeshkannan96/sse-redispubsub. Feel free to check it out for reference or inspiration.
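The same request can be sent from Node.js. A minimal sketch, using only the endpoint and field names shown in the curl examples above (the `sendMessage` helper is a hypothetical name, and `fetch` assumes Node 18+):

```javascript
// Build the JSON payload accepted by the send-message endpoint.
function buildMessage(username, message) {
  return JSON.stringify({ username, message });
}

// POST the payload to the load balancer, which forwards it to one of the
// WebSocket servers for broadcasting (endpoint taken from the curl examples).
async function sendMessage(username, message) {
  const res = await fetch('http://localhost/api/send-message', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildMessage(username, message),
  });
  return res.ok;
}

// Example payload matching the documented JSON structure:
const payload = buildMessage('kattappa', 'Swami, I have a confession to make');
```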