
Deep Neural Network with NumPy

This repository contains a Deep Neural Network (DNN) implemented from scratch in NumPy, with no standard machine learning libraries. The model demonstrates the fundamentals of forward propagation, cost computation, and backpropagation, and supports multiple hidden layers and activation functions.

Overview

This DNN model performs binary classification tasks effectively. The code was developed as a hands-on project while following the "Neural Networks and Deep Learning" course by Andrew Ng from DeepLearning.AI, using the practice dataset provided in the course.

With some adjustments to hyperparameters, this code can be adapted for other binary classification datasets.


Features

  • Forward Propagation: Handles multiple hidden layers with ReLU and sigmoid activation functions.
  • Cost Function: Implements a binary cross-entropy cost function to evaluate performance.
  • Backpropagation: Calculates gradients for weight and bias updates, including derived activation functions.
  • Hyperparameter Tuning: Easily customizable for learning rate, number of layers, and neurons per layer.
  • No External ML Libraries: Built entirely using NumPy, fostering a deeper understanding of neural networks.
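To make the features above concrete, here is a minimal NumPy sketch of the forward pass and the binary cross-entropy cost. The function names (`forward`, `cost`) and the parameter-dictionary layout (`W1`, `b1`, …) are illustrative assumptions, not this repository's actual API:

```python
import numpy as np

def relu(Z):
    # ReLU activation: max(0, Z), element-wise
    return np.maximum(0, Z)

def sigmoid(Z):
    # Sigmoid activation: squashes values into (0, 1)
    return 1 / (1 + np.exp(-Z))

def forward(X, params, L):
    # X: inputs of shape (n_features, m_examples)
    # params: dict holding W1..WL and b1..bL (hypothetical layout)
    # ReLU for hidden layers, sigmoid for the output layer
    A = X
    cache = {"A0": X}
    for l in range(1, L + 1):
        Z = params[f"W{l}"] @ A + params[f"b{l}"]
        A = sigmoid(Z) if l == L else relu(Z)
        cache[f"Z{l}"], cache[f"A{l}"] = Z, A
    return A, cache

def cost(AL, Y):
    # Binary cross-entropy averaged over the m examples;
    # eps guards against log(0)
    eps = 1e-12
    return float(-np.mean(Y * np.log(AL + eps)
                          + (1 - Y) * np.log(1 - AL + eps)))
```

Because the output layer uses a sigmoid, `AL` stays in (0, 1) and can be read directly as the predicted probability of the positive class.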

Applications

  • Binary Classification Problems: Example use case - "Cat vs Not Cat" classification.
  • Educational Purpose: Understand the internals of how a neural network works.
  • Experimentation: Modify and extend the model for other tasks with minor tweaks.

How to Adapt for Your Dataset

  1. Update Input Data: Replace the dataset and preprocessing logic with your binary classification dataset.
  2. Adjust Hyperparameters: Modify parameters such as learning rate, number of layers, and number of neurons per layer.
  3. Train and Evaluate: Train the model on your dataset and evaluate its performance using the built-in functionality.
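The three adaptation steps above might look like the following minimal training sketch. Everything here — `train`, `predict`, the `layer_dims` list, the He-style initialization, and the hyperparameter values — is a hypothetical placeholder under the same design (ReLU hidden layers, sigmoid output, binary cross-entropy), not this repository's actual interface:

```python
import numpy as np

def train(X, Y, layer_dims, learning_rate=0.01, num_iterations=1000, seed=0):
    # layer_dims: e.g. [n_features, hidden_1, ..., 1] (step 2: hyperparameters)
    rng = np.random.default_rng(seed)
    L = len(layer_dims) - 1
    params = {}
    for l in range(1, L + 1):
        # He-style initialization for ReLU layers, zero biases
        scale = np.sqrt(2.0 / layer_dims[l - 1])
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * scale
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))

    m = X.shape[1]
    for _ in range(num_iterations):
        # Forward pass: ReLU hidden layers, sigmoid output
        A, cache = X, {"A0": X}
        for l in range(1, L + 1):
            Z = params[f"W{l}"] @ A + params[f"b{l}"]
            A = 1 / (1 + np.exp(-Z)) if l == L else np.maximum(0, Z)
            cache[f"Z{l}"], cache[f"A{l}"] = Z, A
        # Backward pass: for sigmoid + cross-entropy, dZ at the output is A - Y
        dZ = A - Y
        grads = {}
        for l in range(L, 0, -1):
            grads[f"W{l}"] = dZ @ cache[f"A{l - 1}"].T / m
            grads[f"b{l}"] = dZ.sum(axis=1, keepdims=True) / m
            if l > 1:
                dA = params[f"W{l}"].T @ dZ
                dZ = dA * (cache[f"Z{l - 1}"] > 0)  # ReLU derivative
        # Gradient-descent update
        for l in range(1, L + 1):
            params[f"W{l}"] -= learning_rate * grads[f"W{l}"]
            params[f"b{l}"] -= learning_rate * grads[f"b{l}"]
    return params

def predict(X, params):
    # Threshold the sigmoid output at 0.5 for a binary label
    L = len(params) // 2
    A = X
    for l in range(1, L + 1):
        Z = params[f"W{l}"] @ A + params[f"b{l}"]
        A = 1 / (1 + np.exp(-Z)) if l == L else np.maximum(0, Z)
    return (A > 0.5).astype(int)
```

Step 1 amounts to shaping your data as `X` of shape `(n_features, m)` and `Y` of shape `(1, m)`; steps 2 and 3 are the `layer_dims` / `learning_rate` / `num_iterations` arguments and a `predict` call on held-out data.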

Acknowledgments

This project is inspired by the "Neural Networks and Deep Learning" course by Andrew Ng, which is a fantastic resource for understanding deep learning concepts.


Contact

For queries, discussions, or collaboration, feel free to reach out:


Contributing

Contributions are welcome! If you have ideas to improve this implementation or want to extend it, feel free to create a pull request or reach out directly.