
Repositories

Showing 3 of 3 repositories
  • exllamav3 (Public)

    An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs

    Python · 622 stars · 67 forks · 51 open issues · 2 open pull requests · MIT License · Updated Jan 26, 2026

  • exllamav2 (Public)

    A fast inference library for running LLMs locally on modern consumer-class GPUs

    Python · 4,429 stars · 328 forks · 136 open issues · 23 open pull requests · MIT License · Updated Dec 9, 2025

  • exui (Public)

    Web UI for ExLlamaV2

    JavaScript · 513 stars · 47 forks · 34 open issues · 3 open pull requests · MIT License · Updated Feb 5, 2025
