Binary classification has always been an important task in machine learning, whether it's diagnosing cancer, detecting fraud, or just labeling images as dogs or cats :) That's why so many different algorithms exist for it. In this notebook, I've implemented one of the oldest and most widely used binary classification algorithms: Logistic Regression.
We usually reach for practical solutions like the scikit-learn library, but this time I wanted to implement everything (from data processing to gradient descent) from scratch to better understand how the algorithm actually works. I've also included the formulas for the sigmoid, the logistic regression cost function, and the derivatives of the cost function to make the notebook easier to follow. On top of that, I've implemented a really important technique called regularization. Regularization helps prevent overfitting and improves how well the model generalizes to new data.
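To give a flavor of what the notebook builds, here is a minimal NumPy sketch of those pieces: the sigmoid, an L2-regularized cost, and one gradient descent step. The function names and the `lam`/`lr` parameters are my own choices for illustration, not necessarily the ones used in the notebook.

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def cost(w, b, X, y, lam):
    # Cross-entropy cost with an L2 penalty on the weights (bias excluded)
    m = X.shape[0]
    p = sigmoid(X @ w + b)
    eps = 1e-12  # guard against log(0)
    ce = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return ce + (lam / (2 * m)) * np.sum(w ** 2)

def gradient_step(w, b, X, y, lam, lr):
    # One step of batch gradient descent on the regularized cost
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y         # dCost/dLogits
    dw = X.T @ err / m + (lam / m) * w   # weight gradient + L2 term
    db = np.mean(err)                    # bias gradient (not regularized)
    return w - lr * dw, b - lr * db
```

Running `gradient_step` in a loop on a separable toy dataset drives the cost down and quickly reaches high accuracy; the `lam` term is what keeps the weights from growing without bound on noisier data.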
I hope you enjoy :) Feel free to ask me anything!