Releases: ngxson/llama.cpp
b4052
metal : improve clarity (minor) (#10171)
b4050
swift : exclude ggml-metal-embed.metal (#10211)
* llama.swift : exclude ggml-metal-embed.metal
* swift : exclude build/
b4048
server : revamp chat UI with vuejs and daisyui (#10175)
* server : simple chat UI with vuejs and daisyui
* move old files to legacy folder
* embed deps into binary
* basic markdown support
* add conversation history, save to localStorage
* fix bg-base classes
* save theme preferences
* fix tests
* regenerate, edit, copy buttons
* small fixes
* docs: how to use legacy ui
* better error handling
* make CORS preflight more explicit
* add GET method for CORS
* fix tests
* clean up a bit
* better auto scroll
* small fixes
* use collapse-arrow
* fix closeAndSaveConfigDialog
* small fix
* remove console.log
* fix style for <pre> element
* lighter bubble color (less distracting when reading)
b4044
ggml : add ggml-cpu.h to the public headers (#10204)
b4042
DRY: Fixes clone functionality (#10192)
b4041
fix q4_0_8_8 format for corrupted tokens issue (#10198)
Co-authored-by: EC2 Default User <[email protected]>
b4040
Optimize RWKV6 Operator Naming and Implement Multi-core CPU/SYCL Acc…
b4038
server : remove hack for extra parallel slot (#10187) ggml-ci
b4037
metal : fix from ptr buffer name (#10189)
b4036
ggml : adjust is_first_call init value (#10193) ggml-ci