From a28f27604aa33198c940240118b50d162b87a6d5 Mon Sep 17 00:00:00 2001
From: Ettore Di Giacinto
Date: Thu, 24 Jul 2025 16:18:25 +0200
Subject: [PATCH] Update backends.md

Signed-off-by: Ettore Di Giacinto
---
 docs/content/docs/features/backends.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/content/docs/features/backends.md b/docs/content/docs/features/backends.md
index 83b4cc1dd..2c80b91a1 100644
--- a/docs/content/docs/features/backends.md
+++ b/docs/content/docs/features/backends.md
@@ -96,8 +96,8 @@ Your backend container should:
 For getting started, see the available backends in LocalAI here: https://github.com/mudler/LocalAI/tree/master/backend .
 
 - For Python based backends there is a template that can be used as starting point: https://github.com/mudler/LocalAI/tree/master/backend/python/common/template .
-- For Golang based backends, you can see the `bark-cpp` backend as an example: https://github.com/mudler/LocalAI/tree/master/backend/go/bark
-- For C++ based backends, you can see the `llama-cpp` backend as an example: https://github.com/mudler/LocalAI/tree/master/backend/cpp/llama
+- For Golang based backends, you can see the `bark-cpp` backend as an example: https://github.com/mudler/LocalAI/tree/master/backend/go/bark-cpp
+- For C++ based backends, you can see the `llama-cpp` backend as an example: https://github.com/mudler/LocalAI/tree/master/backend/cpp/llama-cpp
 
 ### Publishing Your Backend