From 59311d8b1e318b4280c15efdb2141074058bf76a Mon Sep 17 00:00:00 2001
From: Mauro Morales
Date: Tue, 9 Sep 2025 16:40:55 +0200
Subject: [PATCH] Point to LocalAI-examples repo for llava (#6241)

Signed-off-by: Mauro Morales
---
 docs/content/docs/features/gpt-vision.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/content/docs/features/gpt-vision.md b/docs/content/docs/features/gpt-vision.md
index 1fc4307f6..52911aaf1 100644
--- a/docs/content/docs/features/gpt-vision.md
+++ b/docs/content/docs/features/gpt-vision.md
@@ -34,5 +34,5 @@ Grammars and function tools can be used as well in conjunction with vision APIs:
 
 All-in-One images have already shipped the llava model as `gpt-4-vision-preview`, so no setup is needed in this case.
 
-To setup the LLaVa models, follow the full example in the [configuration examples](https://github.com/mudler/LocalAI/blob/master/examples/configurations/README.md#llava).
+To setup the LLaVa models, follow the full example in the [configuration examples](https://github.com/mudler/LocalAI-examples/blob/main/configurations/llava/llava.yaml).
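
For reference, the documentation touched by this patch describes a LLaVa model that LocalAI exposes through its OpenAI-compatible chat endpoint. The sketch below is a minimal illustration of calling that endpoint with an image, not part of the patched page: it assumes a LocalAI instance listening on localhost:8080 with the model registered as `gpt-4-vision-preview` (as the All-in-One images ship it), and uses the standard OpenAI chat-completions payload shape that LocalAI mirrors.

    # Minimal sketch of a vision request against LocalAI's OpenAI-compatible API.
    # Assumes a LocalAI instance on localhost:8080 with a llava model exposed as
    # "gpt-4-vision-preview"; host, port, and image URL are illustrative.
    import requests

    payload = {
        "model": "gpt-4-vision-preview",
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "What is in this image?"},
                    {
                        "type": "image_url",
                        "image_url": {"url": "https://example.com/sample.jpg"},
                    },
                ],
            }
        ],
    }

    # Send the request and print the model's description of the image.
    response = requests.post(
        "http://localhost:8080/v1/chat/completions", json=payload, timeout=120
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])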