SouravLenka/ISL-Project

✋ ISL Gesture Recognition using Streamlit

This project is a real-time Indian Sign Language (ISL) hand gesture recognition system developed as a 5th Semester Minor Project. It uses computer vision and machine learning techniques to detect and classify ISL hand gestures directly through a web browser.


🧠 Project Motivation

Indian Sign Language (ISL) is an essential communication medium for the deaf and hard-of-hearing community. This project explores how hand landmark detection and machine learning can be combined to recognize ISL gestures in real time and serve as a foundation for accessible communication systems.


🛠️ Tech Stack

  • Streamlit – Web interface and deployment (Streamlit Cloud)
  • OpenCV – Image capture and preprocessing
  • MediaPipe – Hand landmark detection
  • Scikit-learn – Machine learning library for training and evaluation
  • RandomForestClassifier – Gesture classification model (from scikit-learn)
  • Python – Core programming language

📌 Features

  • 📷 Real-time gesture capture using a webcam
  • 🖐️ Accurate hand landmark extraction using MediaPipe
  • 🗂️ Dataset generation from live gesture images
  • 🧠 Machine learning-based gesture classification
  • 🌐 Browser-based inference using Streamlit
  • ☁️ Deployable on Streamlit Cloud for free

📊 Dataset Details

  • ISL alphabet hand gestures
  • ~1000 images collected per gesture
  • Images captured using webcam under controlled conditions
  • Hand landmarks extracted and used as feature vectors

The dataset was manually collected and processed to ensure consistency and reliable training performance.
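As a rough sketch of how landmarks can become feature vectors (the function name and normalization choice here are illustrative assumptions, not the project's actual code): MediaPipe returns 21 (x, y) landmark points per detected hand, and flattening them after translating relative to the wrist yields a fixed-length, position-invariant feature vector.

```python
import numpy as np

def landmarks_to_features(landmarks):
    """Convert 21 hand landmarks (x, y pairs) into a 42-dimensional
    feature vector, translated so the wrist (landmark 0) sits at the
    origin. Illustrative helper, not the project's exact code."""
    pts = np.array(landmarks, dtype=float)  # shape (21, 2), copied
    pts = pts - pts[0]                      # wrist-relative coordinates
    return pts.flatten()                    # shape (42,)

# Example with dummy landmark coordinates in place of MediaPipe output:
dummy = np.random.rand(21, 2)
features = landmarks_to_features(dummy)
print(features.shape)  # (42,)
```

Translating by the wrist point means the classifier sees the hand's shape rather than its absolute position in the frame.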


🧪 Machine Learning Pipeline

  1. Capture gesture images using OpenCV
  2. Extract hand landmarks using MediaPipe
  3. Convert landmarks into numerical feature vectors
  4. Train a Random Forest classifier on the dataset
  5. Perform real-time inference through Streamlit
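Step 4 above can be sketched with scikit-learn. The synthetic Gaussian clusters below stand in for the real landmark dataset, so the class count, feature dimension, and hyperparameters are assumptions for illustration only:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for the real dataset: 42-dim landmark feature vectors
# (21 (x, y) points) for 3 hypothetical gesture classes, one
# Gaussian cluster per class.
rng = np.random.default_rng(42)
n_per_class, n_features = 100, 42
X = np.vstack([rng.normal(loc=c, scale=0.1, size=(n_per_class, n_features))
               for c in range(3)])
y = np.repeat(np.arange(3), n_per_class)

# Hold out a test split, then train the Random Forest.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

At inference time (step 5), the same feature-extraction code runs on each webcam frame and the trained model's `predict` output is shown in the Streamlit interface.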

🎓 Academic Context

This project was completed as a 5th Semester Minor Project within the undergraduate curriculum. The work involved collaborative dataset creation, model training, testing, and deployment.


👥 Contributors

  • Sai Subham Sahu
  • Sourav Lenka
  • Himanshu Singh

All contributors were involved in data collection, model development, and system implementation.


📜 Note

This project is intended for academic and learning purposes. It demonstrates the application of machine learning and computer vision techniques for real-time gesture recognition.
