mirror/LocalAI
mirror of https://github.com/mudler/LocalAI.git synced 2026-01-09 20:20:10 -06:00
LocalAI/pkg at commit ba66aa33c5f7c7acfc2460fcfaa2250fc59685d2
Bas Hulsken bbf30d416d fix: change initialization order of llama-cpp-avx512 to go before avx2 variant (#4837)
Changed the initialization order of the AVX-512 version of llama.cpp: it is now tried before the AVX2 variant.

Signed-off-by: Bas Hulsken <bhulsken@hotmail.com>
2025-02-17 09:32:21 +01:00
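
For illustration, the sketch below shows one way such a priority order between CPU-specific llama.cpp builds can be expressed in Go: each candidate variant is paired with the CPU feature it requires, and the first supported one is chosen, with the AVX-512 build listed ahead of the AVX2 one. The variant names follow the commit message, but the `backendVariant` type, the selection function, and the use of golang.org/x/sys/cpu are assumptions made for this example, not LocalAI's actual loader code.

```go
// Minimal sketch (not LocalAI's actual loader): pick the most capable
// llama.cpp variant the host CPU supports, trying AVX-512 before AVX2.
package main

import (
	"fmt"

	"golang.org/x/sys/cpu" // runtime detection of x86 CPU features
)

// backendVariant pairs a (hypothetical) backend name with the CPU
// capability it requires.
type backendVariant struct {
	name      string
	supported func() bool
}

// Variants in priority order: AVX-512 is tried before AVX2, mirroring
// the ordering change described in the commit above.
var llamaCppVariants = []backendVariant{
	{"llama-cpp-avx512", func() bool { return cpu.X86.HasAVX512F }},
	{"llama-cpp-avx2", func() bool { return cpu.X86.HasAVX2 }},
	{"llama-cpp-fallback", func() bool { return true }}, // generic build, always usable
}

// pickLlamaCppVariant returns the first variant whose requirement is met.
func pickLlamaCppVariant() string {
	for _, v := range llamaCppVariants {
		if v.supported() {
			return v.name
		}
	}
	return "llama-cpp-fallback"
}

func main() {
	fmt.Println("selected llama.cpp variant:", pickLlamaCppVariant())
}
```

On non-x86 hosts the `cpu.X86` feature flags report false, so this sketch falls through to the generic fallback build.
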
assets       chore: fix go.mod module (#2635)  2024-06-23 08:24:36 +00:00
concurrency  chore: update jobresult_test.go (#4124)  2024-11-12 08:52:18 +01:00
downloader   chore(downloader): support hf.co and hf:// URIs (#4677)  2025-01-24 08:27:22 +01:00
functions    feat(llama.cpp): Add support to grammar triggers (#4733)  2025-02-02 13:25:03 +01:00
grpc         feat: stream tokens usage (#4415)  2024-12-18 09:48:50 +01:00
langchain    …
library      rf: centralize base64 image handling (#2595)  2024-06-24 08:34:36 +02:00
model        fix: change initialization order of llama-cpp-avx512 to go before avx2 variant (#4837)  2025-02-17 09:32:21 +01:00
oci          chore: fix go.mod module (#2635)  2024-06-23 08:24:36 +00:00
startup      chore: drop embedded models (#4715)  2025-01-30 00:03:01 +01:00
store        chore: fix go.mod module (#2635)  2024-06-23 08:24:36 +00:00
templates    feat(template): read jinja templates from gguf files (#4332)  2024-12-08 13:50:33 +01:00
utils        feat(tts): Implement naive response_format for tts endpoint (#4035)  2024-11-02 19:13:35 +00:00
xsync        chore: fix go.mod module (#2635)  2024-06-23 08:24:36 +00:00
xsysinfo     …