llamafile v0.8.6

@jart released this 25 May 14:27 · 81cfbcf

Two minor issues are fixed with this release.

  • 69c2dd3 Don't print special tokens for now (improves shell scriptability; see the example below)
  • 866a129 Upgrade to Cosmopolitan v3.3.8
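
For context on 69c2dd3: when llamafile output is piped into other tools, any special tokens the model emits end up in the captured text. Below is a minimal sketch of such a pipeline; the model file, prompt, and the <|eot_id|> marker are illustrative assumptions, not something shipped with this release.

    # Capture a short completion in a shell variable.
    # Before 69c2dd3, special tokens such as <|eot_id|> could be printed
    # along with the completion, breaking simple text processing like this.
    # (--silent-prompt keeps the prompt itself out of the captured output.)
    answer=$(./llamafile -m mistral-7b-instruct-v0.2.Q4_0.gguf \
        --temp 0 -n 16 --silent-prompt \
        -p 'Answer with one word: what color is the sky?')
    echo "$answer"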

See the llamafile v0.8.5 release notes for further details. If you need prebuilt AMD GPU support on Windows that requires only the graphics driver, please use llamafile v0.8.4 for the next few weeks, until ggerganov/llama.cpp#7156 is resolved.