I would like to request a feature enhancement for the vectra CLI. Specifically, I would like the option to use local Large Language Models (LLMs) instead of relying solely on OpenAI's API.
Feature Details:
Local LLM Integration:
- Add support for integrating local LLMs, such as those served by LM Studio, which exposes an OpenAI-compatible API (a minimal sketch follows this list).
- The integration should be flexible enough to accommodate other local or self-hosted LLM backends as well.
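For context, LM Studio's local server speaks the same wire format as the OpenAI API, so in principle the existing OpenAI client code could be reused by pointing it at a different base URL. A minimal TypeScript sketch, not vectra code; the port and model name are whatever your local LM Studio instance happens to be serving:

```ts
import OpenAI from "openai";

// Point the standard OpenAI client at LM Studio's local, OpenAI-compatible server.
// The base URL below is LM Studio's default; the model name is an example and
// should match whatever model is loaded locally. LM Studio ignores the API key,
// but the client requires a non-empty value.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio",
});

async function embed(text: string): Promise<number[]> {
  const response = await client.embeddings.create({
    model: "nomic-embed-text-v1.5", // example local embedding model
    input: text,
  });
  return response.data[0].embedding;
}
```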
Global Configuration:
- Implement a global configuration option that allows users to specify their preferred LLM provider.
- The configuration should include the necessary parameters for the chosen provider, such as API keys, endpoint URLs, or local model paths (an illustrative example follows this list).
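To make this part of the request concrete, here is one possible shape for such a configuration. The type name and every field are hypothetical and only illustrate the kind of parameters the CLI would need; nothing here exists in vectra today:

```ts
// Hypothetical shape for a global vectra LLM configuration (illustrative only).
interface VectraLlmConfig {
  provider: "openai" | "local"; // which backend the CLI should use
  apiKey?: string;              // required for OpenAI, ignored by most local servers
  baseUrl?: string;             // e.g. "http://localhost:1234/v1" for LM Studio
  model: string;                // embedding or chat model name understood by the provider
}

// Example: prefer a local LM Studio instance by default.
const exampleConfig: VectraLlmConfig = {
  provider: "local",
  baseUrl: "http://localhost:1234/v1",
  model: "nomic-embed-text-v1.5",
};
```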
CLI Commands:
- Ensure that CLI commands can easily switch between different LLM providers based on the global configuration (a possible selection flow is sketched after this list).
- Provide clear documentation and examples on how to set up and use local LLMs within the vectra CLI.
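As an illustration of the switching behavior, the CLI could resolve the provider from the global configuration and allow a per-command override (for example, a hypothetical --provider flag). The sketch below reuses the hypothetical VectraLlmConfig shape from the previous example; none of this is existing vectra behavior:

```ts
// Hypothetical resolution order: an explicit per-command override wins,
// otherwise the provider from the global configuration is used.
function resolveProvider(
  config: VectraLlmConfig,
  cliOverride?: "openai" | "local"
): VectraLlmConfig {
  const provider = cliOverride ?? config.provider;
  if (provider === "openai") {
    // Drop any local base URL so the client falls back to the default OpenAI endpoint.
    return { ...config, provider, baseUrl: undefined };
  }
  return { ...config, provider };
}
```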
Benefits:
- This feature will provide users with more flexibility and control over their LLM integrations.
- It will allow users to utilize local models, which can be beneficial for privacy, cost, and offline usage scenarios.
Thank you for considering this feature request. I believe it will greatly enhance the functionality and usability of the vectra CLI.
Cheers,
Gregor