FitForge is a hands-on notebook for exploring and visualizing advanced regression techniques on the diabetes dataset. It features custom implementations and comparisons of:
## Features
- Linear Regression (with scikit-learn)
- Ridge Regression (both closed-form and gradient descent, from scratch)
- Lasso Regression (gradient descent, from scratch)
- Polynomial Feature Expansion
- ElasticNet Regression
- Rich Visualizations for model comparison and regularization effects
## Linear Regression (Baseline)
- Loads the diabetes dataset (10 features, 442 samples)
- Splits data into training and test sets
- Trains a baseline model
- Evaluates with R² and RMSE
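The baseline workflow can be sketched as follows (a minimal sketch, not the notebook's exact code; the split ratio and random seed are illustrative):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Load the diabetes dataset: 442 samples, 10 features
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Fit the baseline model and evaluate with R^2 and RMSE
model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

r2 = r2_score(y_test, y_pred)
rmse = np.sqrt(mean_squared_error(y_test, y_pred))
print(f"R^2: {r2:.3f}, RMSE: {rmse:.1f}")
```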
## Ridge Regression
- Closed-form solution: Custom implementation
- Gradient Descent: Custom implementation
- Polynomial Ridge: scikit-learn pipeline with degree-16 features
- Visualization: Effect of different alpha values on fit
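The closed-form step can be written in a few lines of NumPy; this is a sketch of the general technique (the regularized normal equations), not necessarily the notebook's exact implementation, and leaving the bias column unpenalized is an assumption:

```python
import numpy as np

def ridge_closed_form(X, y, alpha=1.0):
    """Ridge weights via the regularized normal equations:
    w = (X^T X + alpha * I)^(-1) X^T y, with an unpenalized bias term."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend a bias column
    penalty = alpha * np.eye(Xb.shape[1])
    penalty[0, 0] = 0.0                            # do not shrink the intercept
    # Solve the linear system instead of forming an explicit inverse
    return np.linalg.solve(Xb.T @ Xb + penalty, Xb.T @ y)  # [intercept, w_1, ..., w_d]
```

Using `np.linalg.solve` rather than `np.linalg.inv` is both faster and numerically safer; with an unpenalized intercept this matches scikit-learn's `Ridge` solution.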
## Lasso Regression
- Gradient Descent: Custom implementation with L1 penalty
- Polynomial Lasso: scikit-learn pipeline
- Visualization: Sparsity and feature selection effects
- Alpha Progression Table: Shows impact of regularization
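One standard way to handle the non-smooth L1 penalty in a from-scratch implementation is proximal gradient descent (ISTA) with soft-thresholding; the sketch below uses that technique, with the objective scaled as in scikit-learn's `Lasso`, `(1/2n)·||y − Xw||² + alpha·||w||₁` (the notebook's own update rule may differ):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm: shrink toward zero, clipping at zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, alpha=0.1, n_iter=1000):
    """Lasso via proximal gradient descent (ISTA), no intercept."""
    n, d = X.shape
    w = np.zeros(d)
    # Step size 1/L, where L = sigma_max(X)^2 / n bounds the gradient's Lipschitz constant
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n          # gradient of the smooth least-squares part
        w = soft_threshold(w - step * grad, step * alpha)
    return w
```

The soft-thresholding step is what produces exact zeros, i.e. the automatic feature selection the table above illustrates.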
## ElasticNet Regression
- Combines L1 and L2 penalties
- Customizable `alpha` and `l1_ratio`
- Evaluates on test set
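A minimal sketch of the ElasticNet step with scikit-learn; the specific `alpha` and `l1_ratio` values here are illustrative, not the notebook's:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# l1_ratio interpolates between Ridge (0.0) and Lasso (1.0)
enet = ElasticNet(alpha=0.001, l1_ratio=0.5, max_iter=50000).fit(X_train, y_train)
r2 = r2_score(y_test, enet.predict(X_test))
print(f"R^2: {r2:.3f}, non-zero coefficients: {np.count_nonzero(enet.coef_)}")
```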
## Key Insights
- Ridge & Lasso: Demonstrates how increasing regularization smooths predictions and shrinks coefficients
- Lasso: Shows automatic feature selection (sparsity)
- ElasticNet: Balances Ridge and Lasso effects
- Visuals: Colorful plots for each method, with overlays for different alpha values
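The shrinkage and sparsity effects shown in the plots can also be checked numerically; a small sketch (the alpha grid is illustrative):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Lasso, Ridge

X, y = load_diabetes(return_X_y=True)

ridge_norms, lasso_nonzero = [], []
for alpha in (0.01, 0.1, 1.0):  # illustrative regularization strengths
    # Ridge: coefficient norm shrinks as alpha grows
    ridge_norms.append(np.linalg.norm(Ridge(alpha=alpha).fit(X, y).coef_))
    # Lasso: coefficients are driven exactly to zero as alpha grows
    lasso_nonzero.append(
        np.count_nonzero(Lasso(alpha=alpha, max_iter=100000).fit(X, y).coef_))

print("Ridge coefficient norms:", [round(n, 1) for n in ridge_norms])
print("Lasso non-zero counts:  ", lasso_nonzero)
```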
## How to Run
- Open `FitForge.ipynb` in Jupyter or Colab
- Run all cells sequentially
- Explore the outputs, plots, and code for each regression method
- Adjust `alpha`, `degree`, and `l1_ratio` to see their effects
## Highlights
- Custom Ridge and Lasso from scratch (not just scikit-learn!)
- Interactive, step-by-step visualizations
- Clear demonstration of regularization and feature selection
- All code and results in a single, easy-to-follow notebook
## Contributing
Feel free to open issues or suggest improvements!