longway2go-ai/MiniLM-A-Small-Language-Model-from-Scratch

🧠 MiniLM: A Small Language Model from Scratch

MiniLM is a small, efficient, and educational language model built from scratch, inspired by Andrej Karpathy's nanoGPT and Microsoft's TinyStories research paper. The project is designed to help you understand the full lifecycle of training a Transformer-based model, from data preparation to inference, in an accessible and minimalist codebase.


🚀 Features

  • βš™οΈ Pure PyTorch implementation β€” no external training libraries
  • πŸ“š Clean, well-commented code for learning and experimentation
  • πŸ—οΈ Modular Transformer architecture
  • 🧾 Token-level and character-level tokenization options
  • πŸ“‰ Training on TinyStories-like synthetic dataset (or your own!)
  • πŸ“¦ Efficient training with gradient accumulation and mixed precision
  • πŸ§ͺ Tiny inference script to generate new text from a prompt
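The last two training features can be illustrated together. Below is a minimal sketch of how gradient accumulation and mixed precision combine in a PyTorch training step; the `nn.Linear` stand-in model, the squared-output loss, and all hyperparameters are placeholders for illustration, not the repo's actual code:

```python
import torch
import torch.nn as nn

accum_steps = 4                                   # micro-batches per optimizer step (assumed value)
device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Linear(16, 16).to(device)              # stand-in for the Transformer
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
# GradScaler guards fp16 gradients against underflow; it no-ops on CPU.
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

batches = [torch.randn(8, 16) for _ in range(accum_steps)]

optimizer.zero_grad(set_to_none=True)
for i, x in enumerate(batches):
    x = x.to(device)
    # Autocast runs the forward pass in reduced precision where it is safe.
    with torch.autocast(device_type=device, enabled=(device == "cuda")):
        loss = model(x).pow(2).mean() / accum_steps   # divide so gradients average
    scaler.scale(loss).backward()                     # gradients accumulate across micro-batches
    if (i + 1) % accum_steps == 0:
        scaler.step(optimizer)                        # one optimizer step per accumulation window
        scaler.update()
        optimizer.zero_grad(set_to_none=True)
```

Dividing each micro-batch loss by `accum_steps` makes the accumulated gradient equal the gradient of one large batch, so the effective batch size grows without extra memory.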

📖 Background

  • nanoGPT: A compact reimplementation of OpenAI’s GPT-style models by Andrej Karpathy, focusing on readability and simplicity.
  • TinyStories: A Microsoft Research paper demonstrating that small Transformer models (under 10M parameters) can generate coherent short stories when trained on domain-specific synthetic datasets.

This project blends both ideas: the simplicity and training loop style of nanoGPT with the scale and dataset philosophy of TinyStories.


πŸ—οΈ Architecture

  • GPT-style Transformer decoder
  • Causal self-attention
  • Positional embeddings
  • Configurable depth and width (e.g., 2-6 layers, 128-512 hidden units)
  • Dropout for regularization
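The pieces above can be sketched as a single PyTorch module. This is an illustrative GPT-style decoder using `nn.TransformerEncoderLayer` with a causal mask, not the repo's actual implementation; the class name, defaults (2 layers, 128-dim embeddings), and vocabulary size are assumptions:

```python
import torch
import torch.nn as nn

class MiniGPT(nn.Module):
    """Illustrative GPT-style decoder: token + learned positional embeddings,
    causal self-attention blocks, dropout, and a language-model head."""
    def __init__(self, vocab_size=256, n_layer=2, n_head=4, n_embd=128,
                 block_size=64, dropout=0.1):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, n_embd)
        self.pos_emb = nn.Embedding(block_size, n_embd)   # learned positions
        layer = nn.TransformerEncoderLayer(
            d_model=n_embd, nhead=n_head, dim_feedforward=4 * n_embd,
            dropout=dropout, batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=n_layer)
        self.ln_f = nn.LayerNorm(n_embd)
        self.head = nn.Linear(n_embd, vocab_size, bias=False)

    def forward(self, idx):
        B, T = idx.shape
        pos = torch.arange(T, device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        # Causal mask: True above the diagonal blocks attention to future tokens.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=idx.device), 1)
        x = self.blocks(x, mask=mask)
        return self.head(self.ln_f(x))   # logits over the vocabulary

model = MiniGPT()
logits = model(torch.randint(0, 256, (2, 16)))  # (batch, time) token ids
print(logits.shape)  # torch.Size([2, 16, 256])
```

Varying `n_layer` and `n_embd` covers the 2-6 layer, 128-512 unit range mentioned above; the boolean upper-triangular mask is what makes the self-attention causal.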
