diff --git a/docs/content/docs/advanced/advanced-usage.md b/docs/content/docs/advanced/advanced-usage.md
index 2ddea421b..8feaf9518 100644
--- a/docs/content/docs/advanced/advanced-usage.md
+++ b/docs/content/docs/advanced/advanced-usage.md
@@ -95,7 +95,7 @@ Specifying a `config-file` via CLI allows to declare models in a single file as
 chat: chat
 ```
 
-See also [chatbot-ui](https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui) as an example on how to use config files.
+See also [chatbot-ui](https://github.com/mudler/LocalAI-examples/tree/main/chatbot-ui) as an example of how to use config files.
 
 It is possible to specify a full URL or a short-hand URL to a YAML model configuration file and use it on start with local-ai, for example to use phi-2:
 
@@ -341,7 +341,7 @@ Below is an instruction that describes a task, paired with an input that provide
 
 Instead of installing models manually, you can use the LocalAI API endpoints and a model definition to install programmatically via API models in runtime.
 
-A curated collection of model files is in the [model-gallery](https://github.com/go-skynet/model-gallery) (work in progress!). The files of the model gallery are different from the model files used to configure LocalAI models. The model gallery files contains information about the model setup, and the files necessary to run the model locally.
+A curated collection of model files is in the [model-gallery](https://github.com/mudler/LocalAI/tree/master/gallery). The files of the model gallery are different from the model files used to configure LocalAI models. The model gallery files contain information about the model setup, and the files necessary to run the model locally.
 
 To install for example `lunademo`, you can send a POST call to the `/models/apply` endpoint with the model definition url (`url`) and the name of the model should have in LocalAI (`name`, optional):
 
diff --git a/docs/content/docs/faq.md b/docs/content/docs/faq.md
index c1dc24ec7..46fd8c849 100644
--- a/docs/content/docs/faq.md
+++ b/docs/content/docs/faq.md
@@ -46,7 +46,7 @@ There is the availability of localai-webui and chatbot-ui in the examples sectio
 
 ### Does it work with AutoGPT?
 
-Yes, see the [examples](https://github.com/go-skynet/LocalAI/tree/master/examples/)!
+Yes, see the [examples](https://github.com/mudler/LocalAI-examples)!
 
 ### How can I troubleshoot when something is wrong?
 
diff --git a/docs/content/docs/features/embeddings.md b/docs/content/docs/features/embeddings.md
index 7e0f3abf4..e6464634f 100644
--- a/docs/content/docs/features/embeddings.md
+++ b/docs/content/docs/features/embeddings.md
@@ -75,4 +75,4 @@ curl http://localhost:8080/embeddings -X POST -H "Content-Type: application/json
 
 ## 💡 Examples
 
-- Example that uses LLamaIndex and LocalAI as embedding: [here](https://github.com/go-skynet/LocalAI/tree/master/examples/query_data/).
+- Example that uses LLamaIndex and LocalAI as embedding: [here](https://github.com/mudler/LocalAI-examples/tree/main/query_data).
diff --git a/docs/content/docs/features/openai-functions.md b/docs/content/docs/features/openai-functions.md
index 5d43ece03..2f0c8c419 100644
--- a/docs/content/docs/features/openai-functions.md
+++ b/docs/content/docs/features/openai-functions.md
@@ -263,4 +263,4 @@ Grammars and function tools can be used as well in conjunction with vision APIs:
 
 ## 💡 Examples
 
-A full e2e example with `docker-compose` is available [here](https://github.com/go-skynet/LocalAI/tree/master/examples/functions).
+A full e2e example with `docker-compose` is available [here](https://github.com/mudler/LocalAI-examples/tree/main/functions).
diff --git a/docs/content/docs/getting-started/models.md b/docs/content/docs/getting-started/models.md
index a57fa9942..08a25e982 100644
--- a/docs/content/docs/getting-started/models.md
+++ b/docs/content/docs/getting-started/models.md
@@ -207,4 +207,4 @@ For instructions on building LocalAI from source, see the [Build Section]({{% re
 {{% /tab %}}
 {{< /tabs >}}
 
-For more model configurations, visit the [Examples Section](https://github.com/mudler/LocalAI/tree/master/examples/configurations).
+For more model configurations, visit the [Examples Section](https://github.com/mudler/LocalAI-examples/tree/main/configurations).
diff --git a/docs/content/docs/whats-new.md b/docs/content/docs/whats-new.md
index e4f7ab25c..320d0dca1 100644
--- a/docs/content/docs/whats-new.md
+++ b/docs/content/docs/whats-new.md
@@ -99,8 +99,8 @@ Thanks to the community efforts now we have a new [how-to website](https://io.mi
 
 #### 💡 More examples!
 
-- Open source autopilot? See the new addition by {{< github "gruberdev" >}} in our [examples](https://github.com/go-skynet/LocalAI/tree/master/examples/continue) on how to use Continue with LocalAI!
-- Want to try LocalAI with Insomnia? Check out the new [Insomnia example](https://github.com/go-skynet/LocalAI/tree/master/examples/insomnia) by {{< github "dave-gray101" >}}!
+- Open source autopilot? See the new addition by {{< github "gruberdev" >}} in our [examples](https://github.com/mudler/LocalAI-examples/tree/main/continue) on how to use Continue with LocalAI!
+- Want to try LocalAI with Insomnia? Check out the new [Insomnia example](https://github.com/mudler/LocalAI-examples/tree/main/insomnia) by {{< github "dave-gray101" >}}!
 
 #### LocalAGI in discord!
 
@@ -258,7 +258,7 @@ And here when it actually picks to reply to the user instead of using functions!
 
 Note: functions are supported only with `llama.cpp`-compatible models.
 
-A full example is available here: https://github.com/go-skynet/LocalAI/tree/master/examples/functions
+A full example is available here: https://github.com/mudler/LocalAI-examples/tree/main/functions
 
 ### gRPC backends
 
@@ -377,9 +377,9 @@ We now support a vast variety of models, while being backward compatible with pr
 
 ### Examples
 
-- 💡 [AutoGPT](https://github.com/go-skynet/LocalAI/tree/master/examples/autoGPT) example ( [mudler](https://github.com/mudler) )
-- 💡 [PrivateGPT](https://github.com/go-skynet/LocalAI/tree/master/examples/privateGPT) example ( [mudler](https://github.com/mudler) )
-- 💡 [Flowise](https://github.com/go-skynet/LocalAI/tree/master/examples/flowise) example ( [mudler](https://github.com/mudler) )
+- 💡 [AutoGPT](https://github.com/mudler/LocalAI-examples/tree/main/autoGPT) example ( [mudler](https://github.com/mudler) )
+- 💡 [PrivateGPT](https://github.com/mudler/LocalAI-examples/tree/main/privateGPT) example ( [mudler](https://github.com/mudler) )
+- 💡 [Flowise](https://github.com/mudler/LocalAI-examples/tree/main/flowise) example ( [mudler](https://github.com/mudler) )
 
 Two new projects offer now direct integration with LocalAI!
 
@@ -449,7 +449,7 @@ Now LocalAI can generate images too:
 
 - 14-05-2023: __v1.11.1__ released! `rwkv` backend patch release
 - 13-05-2023: __v1.11.0__ released! 🔥 Updated `llama.cpp` bindings: This update includes a breaking change in the model files ( https://github.com/ggerganov/llama.cpp/pull/1405 ) - old models should still work with the `gpt4all-llama` backend.
-- 12-05-2023: __v1.10.0__ released! 🔥🔥 Updated `gpt4all` bindings. Added support for GPTNeox (experimental), RedPajama (experimental), Starcoder (experimental), Replit (experimental), MosaicML MPT. Also now `embeddings` endpoint supports tokens arrays. See the [langchain-chroma](https://github.com/go-skynet/LocalAI/tree/master/examples/langchain-chroma) example! Note - this update does NOT include https://github.com/ggerganov/llama.cpp/pull/1405 which makes models incompatible.
+- 12-05-2023: __v1.10.0__ released! 🔥🔥 Updated `gpt4all` bindings. Added support for GPTNeox (experimental), RedPajama (experimental), Starcoder (experimental), Replit (experimental), MosaicML MPT. Also now `embeddings` endpoint supports tokens arrays. See the [langchain-chroma](https://github.com/mudler/LocalAI-examples/tree/main/langchain-chroma) example! Note - this update does NOT include https://github.com/ggerganov/llama.cpp/pull/1405 which makes models incompatible.
 - 11-05-2023: __v1.9.0__ released! 🔥 Important whisper updates ( {{< pr "233" >}} {{< pr "229" >}} ) and extended gpt4all model families support ( {{< pr "232" >}} ). Redpajama/dolly experimental ( {{< pr "214" >}} )
 - 10-05-2023: __v1.8.0__ released! 🔥 Added support for fast and accurate embeddings with `bert.cpp` ( {{< pr "222" >}} )
 - 09-05-2023: Added experimental support for transcriptions endpoint ( {{< pr "211" >}} )
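The `/models/apply` call mentioned in the advanced-usage hunk takes a JSON body with the model definition `url` and an optional `name`. A minimal sketch of that request, assuming a LocalAI instance on the default port; the definition URL below is a placeholder, not a real gallery entry:

```shell
# Build the JSON body for the /models/apply request described in
# advanced-usage.md. "<model-definition-url>" is a placeholder for a
# real model-gallery definition; "name" is optional and sets the name
# the model will have in LocalAI.
PAYLOAD='{"url": "<model-definition-url>", "name": "lunademo"}'

# Sending the request requires a running LocalAI instance, so the curl
# call is shown commented out:
# curl http://localhost:8080/models/apply \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"

echo "$PAYLOAD"
```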