forked from ggerganov/llama.cpp
Issues: LostRuins/koboldcpp
#998 · Silent audio track emits deep buzzing sound + another small issue · opened Jul 15, 2024 by pbz134
#978 · AMD GPU VRAM idling at 0mhz tanking performance on Vulkan and ROCm · opened Jul 8, 2024 by YW5555
#975 · New API feature that generates one token but shows its logits / probabilities ("Manual sampling" mode) · [enhancement, low priority] · opened Jul 7, 2024 by aleksusklim
#973 · Kobold UI's openai compatible API not working with runpod (Serverless vLLM) · opened Jul 5, 2024 by morbidCode
#969 · Incorporate Windows ARM64 binaries · [enhancement] · opened Jul 3, 2024 by 77poker125
#966 · Enable streaming on KoboldAI Lite when using remote hosts · [enhancement] · opened Jul 2, 2024 by morbidCode
#964 · Another "llama_new_context_with_model: failed to initialize Metal backend" issue · opened Jul 2, 2024 by RDearnaley