
Support for Using a Local LLM Instead of OpenAI for Enhanced Security #115

Closed
EyalPery opened this issue Jul 7, 2024 · 1 comment
EyalPery commented Jul 7, 2024

It would be highly beneficial to have the option to integrate Cover Agent with a local LLM instead of sending data to the OpenAI API. This feature would enhance security and data privacy for users who handle sensitive information.

Thanks!

mrT23 (Contributor) commented Jul 8, 2024

@EyalPery you can definitely do that. We support LiteLLM:
https://github.com/Codium-ai/cover-agent?tab=readme-ov-file#using-other-llms

For example, you can use Ollama to run models locally:
https://litellm.vercel.app/docs/providers/ollama

Make sure to use high-quality, code-oriented models.
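For reference, here is a minimal sketch of how LiteLLM routes a completion to a local Ollama server instead of the OpenAI API (the model name and prompt are placeholders; it assumes Ollama is already running on its default port with the model pulled):

```python
# Minimal sketch: send a completion through LiteLLM to a local Ollama server.
# Assumes `ollama serve` is running and the model was pulled beforehand, e.g.:
#   ollama pull codellama
from litellm import completion

response = completion(
    model="ollama/codellama",           # placeholder; use any local code-oriented model
    messages=[{"role": "user", "content": "Write a pytest for a FastAPI route."}],
    api_base="http://localhost:11434",  # default Ollama endpoint; no data leaves the machine
)
print(response["choices"][0]["message"]["content"])
```

Cover Agent exposes the same routing via its CLI (see the "Using other LLMs" README section linked above), where you point the `--model` and `--api-base` options at the local server.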

mrT23 added the answered label on Jul 10, 2024
mrT23 closed this as completed on Jul 10, 2024