* WIP - add endpoint
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Rename
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Wire the Completion API
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Try to make it functional
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Almost functional
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Bump golang versions used in tests
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Add description of the tool
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Make it work
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Small optimizations
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Cleanup/refactor
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Update docs
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
---------
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
CI: disable testing on PRs against arm64
Removed the configuration for the cublas and arm64 platforms.
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
* chore(cudds): add cudds to l4t images
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* add arm64 to CI tests
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
---------
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Rename tag suffix for hipblas whisper to match backend config
hipblas images generally use the suffix `-gpu-rocm-hipblas-X`. The one current exception is the hipblas build of Whisper, which uses the suffix `-gpu-hipblas-whisper`.
However, because `backend/index.yaml` references the Whisper image tag using the more consistent form (i.e. `latest-gpu-rocm-hipblas-whisper`), it is not possible to add the backend, as raised in #6114.
Therefore, rename the suffix of the hipblas Whisper images to use the more consistent form, aligning with the other hipblas builds as well as with the expected image name in `backend/index.yaml`.
Signed-off-by: Kingsley Jarrett <kj@kingj.net>
* chore(ci): Build Go based backends on Darwin
Signed-off-by: Richard Palethorpe <io@richiejp.com>
* chore(stablediffusion-ggml): Fixes for building on Darwin
Signed-off-by: Richard Palethorpe <io@richiejp.com>
* chore(whisper): Build on Darwin
Signed-off-by: Richard Palethorpe <io@richiejp.com>
---------
Signed-off-by: Richard Palethorpe <io@richiejp.com>
* feat(mlx-audio): Add mlx-audio backend
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* improve loading
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* CI tests
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* fix: set C_INCLUDE_PATH to point to python install
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
---------
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Add launcher (WIP)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Update gomod
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Cleanup, focus on systray
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Separate launcher from main
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Add a way to identify the binary version
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Implement config saving and start on boot
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Small fixups
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Save installed version as metadata
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Stop LocalAI on quit
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Fix goreleaser
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Check first if the binary is there
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Do not show the version if we don't have it
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Try to build on CI
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* use fyne package
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Add to release
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Fixups
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Fyne.Do
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Show the WebUI button only if LocalAI is started
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Default to localhost
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* CI
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Show release notes
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Update logo
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Small improvements and fix tests
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Try to fix e2e tests
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
---------
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* feat(backends): bundle python
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Test CI
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* vllm on self-hosted
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Add clang
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Try to fix it for Mac
* Relocate links only when portable
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Make sure to call macosPortableEnv
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Use self-hosted for vllm
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* Fixups
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
* CI
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
---------
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>