Is it possible to run a LLM on a mini-pc like the GMKtec K8 and K9?
I have experience running servers, but I'd like to know whether it's feasible. I just need a private LLM with roughly GPT-3.5-level capability.
Look into Ollama. It shouldn't be an issue if you stick to 7B-parameter models.
Yeah, I did see something related to what you mentioned and I was quite interested. What about quantized models?
Quantized with more parameters is generally better than floating point with fewer parameters: a 14B-parameter model quantized down to 4-bit integers will still generally outperform the equivalent 7B-parameter model in 16-bit floating point.
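To put rough numbers on that, here's a quick back-of-the-envelope sketch of the weight memory alone (it ignores KV cache and runtime overhead, so real usage is somewhat higher):

```python
def model_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory needed just for the model weights, in GiB."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1024**3

# 7B model in 16-bit floating point:
print(f"7B  @ fp16: {model_memory_gb(7, 16):.1f} GiB")   # ~13 GiB
# 14B model quantized to 4-bit integers:
print(f"14B @ int4: {model_memory_gb(14, 4):.1f} GiB")   # ~6.5 GiB
```

So the quantized 14B model actually needs about half the RAM of the fp16 7B one, which is why quantization is such a good fit for mini-PCs.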
Interesting information mate, I'm reading up on the subject, thx for the help 👍👍
I don't have any experience with them honestly, so I can't help you there.
Appreciate you 👍👍