Finally!
I need this to keep working on PixelPolygot, as I want to do more testing with local models. So glad this is now supported natively in llama.cpp, since the #rocm port of koboldcpp seems dead in the water.
https://github.com/ggml-org/llama.cpp/pull/12898