From ec11b61f67ca57f9d982b2aa88d8cfd3eabc6be4 Mon Sep 17 00:00:00 2001
From: Carl Sutton
Date: Tue, 9 Apr 2024 10:09:37 +0200
Subject: [PATCH] chore: link to env docs

---
 docs/config-azure-openai.md   | 3 +++
 docs/config-local-lmstudio.md | 3 +++
 docs/config-local-localai.md  | 3 +++
 docs/config-local-ollama.md   | 3 +++
 docs/config-openrouter.md     | 3 +++
 5 files changed, 15 insertions(+)

diff --git a/docs/config-azure-openai.md b/docs/config-azure-openai.md
index a1a8d6657..ca6eb5948 100644
--- a/docs/config-azure-openai.md
+++ b/docs/config-azure-openai.md
@@ -20,6 +20,9 @@ If you have an `API Endpoint` and `API Key`, you can configure big-AGI as follow
 The deployed models are now available in the application.
 If you don't have a configured Azure OpenAI service instance, continue with the next section.
 
+In addition to using the UI, configuration can also be done using
+[environment variables](environment-variables.md).
+
 ## Setting Up Azure
 
 ### Step 1: Azure Account & Subscription
diff --git a/docs/config-local-lmstudio.md b/docs/config-local-lmstudio.md
index bd06224b6..5d074bb1e 100644
--- a/docs/config-local-lmstudio.md
+++ b/docs/config-local-lmstudio.md
@@ -37,6 +37,9 @@ Check the URL and modify if different.
 2. Enter the API URL: `http://localhost:1234` (modify if different)
 3. Refresh by clicking on the `Models` button to load models from LM Studio
 
+In addition to using the UI, configuration can also be done using
+[environment variables](environment-variables.md).
+
 ## Troubleshooting
 
 - **Missing @mui/material**: Execute `npm install @mui/material` or `yarn add @mui/material`
diff --git a/docs/config-local-localai.md b/docs/config-local-localai.md
index 0ac763147..fcd30894b 100644
--- a/docs/config-local-localai.md
+++ b/docs/config-local-localai.md
@@ -36,6 +36,9 @@ Follow the guide at: https://localai.io/basics/getting_started/
 - Load the models (click on `Models 🔄`)
 - Select the model and chat
 
+In addition to using the UI, configuration can also be done using
+[environment variables](environment-variables.md).
+
 ### Integration: Models Gallery
 
 If the running LocalAI instance is configured with a [Model Gallery](https://localai.io/models/):
diff --git a/docs/config-local-ollama.md b/docs/config-local-ollama.md
index 0d526308c..fc507e421 100644
--- a/docs/config-local-ollama.md
+++ b/docs/config-local-ollama.md
@@ -22,6 +22,9 @@ _Last updated Dec 16, 2023_
    you'll have to press the 'Pull' button again, until a green message appears.
 5. **Chat with Ollama models**: select an Ollama model and begin chatting with AI personas
 
+In addition to using the UI, configuration can also be done using
+[environment variables](environment-variables.md).
+
 **Visual Configuration Guide**:
 
 * After adding the `Ollama` model vendor, entering the IP address of an Ollama server, and refreshing models:
diff --git a/docs/config-openrouter.md b/docs/config-openrouter.md
index 51bd7bada..12623c32e 100644
--- a/docs/config-openrouter.md
+++ b/docs/config-openrouter.md
@@ -22,6 +22,9 @@ This document details the process of integrating OpenRouter with big-AGI.
 ![feature-openrouter-configure.png](pixels/feature-openrouter-configure.png)
 4. OpenAI GPT4-32k and other models will now be accessible and selectable in the application.
 
+In addition to using the UI, configuration can also be done using
+[environment variables](environment-variables.md).
+
 ### Pricing
 
 OpenRouter independently manages its service and pricing and is not affiliated with big-AGI.
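The sentence this patch adds to each provider page points readers at `environment-variables.md` for server-side setup. As a purely illustrative sketch of what that alternative looks like (every variable name below is an assumption, not something this patch or its docs confirm; the authoritative names live in `environment-variables.md`), a `.env` fragment covering the providers touched above might resemble:

```shell
# Hypothetical .env fragment -- all variable names are assumptions;
# consult docs/environment-variables.md for the names big-AGI actually reads.
AZURE_OPENAI_API_ENDPOINT="https://your-instance.openai.azure.com"
AZURE_OPENAI_API_KEY="..."
OPENROUTER_API_KEY="sk-or-..."
OLLAMA_API_HOST="http://127.0.0.1:11434"
```

Unlike keys entered in the UI, values supplied this way are read when the server starts, so a restart is needed after editing the file.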