llama.cpp for GPU only

I’ve been using llama.cpp, gpt-llama and chatbot-ui for a while now, and I’m very happy with that setup. However, I’m now looking into a more stable setup that runs only on the GPU. Is llama.cpp still a good candidate for that?
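For reference, here is a minimal sketch of what I mean by a GPU-only setup, using the llama-cpp-python bindings; the model path is a placeholder and the build flag is just an example (older releases used -DLLAMA_CUBLAS=on instead):

```python
# Minimal GPU-only sketch with llama-cpp-python.
# Assumes the bindings were installed with GPU support, e.g.:
#   CMAKE_ARGS="-DGGML_CUDA=on" pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="./models/your-model.gguf",  # placeholder path
    n_gpu_layers=-1,  # offload all layers to the GPU
    n_ctx=4096,       # context window; adjust for your model
)

out = llm("Q: What is llama.cpp? A:", max_tokens=64)
print(out["choices"][0]["text"])
```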

8 comments