diff --git a/docs/how-tos/vs-code/datacoves-copilot/v2.md b/docs/how-tos/vs-code/datacoves-copilot/v2.md
index c923db2..41269e3 100644
--- a/docs/how-tos/vs-code/datacoves-copilot/v2.md
+++ b/docs/how-tos/vs-code/datacoves-copilot/v2.md
@@ -7,32 +7,17 @@ sidebar_position: 6
 
 This section describes how to configure and use Datacoves Copilot v2, which comes installed on Datacoves v4+, enhancing the experience and supporting the following LLM providers:
 
-- Anthropic
-- DeepSeek
-- Google Gemini
-- OpenAI
-- Azure OpenAI
-- OpenAI Compatible
-- Open Router
-- xAI (Grok)
+- [Anthropic](#anthropic-llm-provider)
+- [Azure OpenAI](#azure-openai-llm-provider)
+- [DeepSeek](#deepseek-llm-provider)
+- [Google Gemini](#google-gemini-llm-provider)
+- [OpenAI](#openai-llm-provider)
+- [OpenAI Compatible](#openai-compatible-llm-providers)
+- [Open Router](#open-router-llm-provider)
+- [xAI (Grok)](#xai-grok-llm-provider)
 
 ## How Tos
 
-import Tabs from '@theme/Tabs';
-import TabItem from '@theme/TabItem';
-
-
-
-
-
 ## Configure your LLM in Datacoves Copilot v2
 
 ## Create a Datacoves Secret
@@ -42,7 +27,7 @@ Creating a [Datacoves Secret](/how-tos/datacoves/how_to_secrets.md) requires som
 - **Name:** The secret must be named `datacoves-copilot-api-configs`
 - **Description:** Provide a simple description such as: `Datacoves Copilot config`
 - **Format:** Select `Raw JSON`
-- **Value**: The value will vary depending on the LLM you are utilizing, click on the corresponding tab or on `More Providers` to see more.
+- **Value**: The value will vary depending on the LLM you are utilizing; see the provider sections below.
 - **Scope:** Select the desired scope, either `Project` or `Environment`.
 - **Project/Environment:** Select the `Project` or `Environment` that will access this LLM.
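The rewritten list items link to anchors derived from the new `## … LLM Provider` headings. Docusaurus generates those anchors with a GitHub-style slug rule (the real implementation is `github-slugger`, which handles more edge cases); a rough sketch of that rule is enough to sanity-check that each new link fragment matches its heading:

```python
import re

def slugify(heading: str) -> str:
    """Approximate GitHub/Docusaurus heading-to-anchor rule:
    lowercase, drop punctuation, turn runs of whitespace into hyphens."""
    s = heading.strip().lower()
    s = re.sub(r"[^\w\s-]", "", s)   # drop punctuation such as parentheses
    return re.sub(r"\s+", "-", s)    # spaces become hyphens

# Headings introduced by this diff, paired with the anchors the new list uses
checks = {
    "Anthropic LLM Provider": "anthropic-llm-provider",
    "Azure OpenAI LLM Provider": "azure-openai-llm-provider",
    "OpenAI Compatible LLM Providers": "openai-compatible-llm-providers",
    "xAI Grok LLM Provider": "xai-grok-llm-provider",
}
for heading, anchor in checks.items():
    assert slugify(heading) == anchor
```

Note that the list entry reads `xAI (Grok)` while the heading is `xAI Grok LLM Provider`; the link works because the fragment is taken from the heading's slug, not from the link text.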
@@ -77,9 +62,7 @@ Lastly, be sure to toggle on the `Share with developers` option so that users wi
   allowfullscreen
 >
 
-
-
-
+## LLM Providers
 
 ## Anthropic LLM Provider
@@ -128,9 +111,6 @@ Datacoves Copilot supports the following Anthropic Claude models:
 
 See [Anthropic's Model Documentation](https://docs.claude.com/en/docs/about-claude/models/overview) for more details on each model's capabilities.
 
-
-
-
 ## OpenAI LLM Provider
 
 Datacoves Copilot supports accessing models directly through the official OpenAI API, including the latest GPT-5 family with advanced features like reasoning effort control and verbosity settings.
@@ -221,9 +201,6 @@ Optimized GPT-4 models:
 
 Refer to the [OpenAI Models documentation](https://platform.openai.com/docs/models) for the most up-to-date list of models and capabilities.
 
-
-
-
 ## Azure OpenAI LLM Provider
 
 Datacoves Copilot supports Azure OpenAI models through the OpenAI API compatible interface.
@@ -324,11 +301,6 @@ Use your **deployment name** from Azure AI Foundry as the `openAiModelId`. This
 
 Refer to [Azure OpenAI documentation](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/concepts/models) for the most current model availability and regional deployment options.
 
-
-
-
-#### OpenAI Compatible
-
 ## OpenAI Compatible LLM Providers
 
 Datacoves Copilot supports a wide range of AI model providers that offer APIs compatible with the OpenAI API standard. This means you can use models from providers other than OpenAI, while still using a familiar API interface. This includes providers like:
@@ -337,7 +309,7 @@ Datacoves Copilot supports a wide range of AI model providers that offer APIs co
 - Cloud providers like Perplexity, Together AI, Anyscale, and others.
 - Any other provider offering an OpenAI-compatible API endpoint.
 
-**Note:** For Azure OpenAI, see the dedicated [Azure OpenAI tab](#azure-openai) for specific setup instructions.
+**Note:** For Azure OpenAI, see the dedicated [Azure OpenAI](#azure-openai-llm-provider) section for specific setup instructions.
 
 ### Secret value format
 
 ```json
 {
@@ -381,9 +353,6 @@ Fine tune model usage using this additional configuration under the `openAiCusto
 }
 ```
 
-
-
-
 ## Google Gemini LLM Provider
 
 Datacoves Copilot supports Google's Gemini family of models through the Google AI Gemini API.
@@ -436,20 +405,13 @@ Datacoves Copilot supports the following Gemini models:
 
 Refer to the [Gemini documentation](https://ai.google.dev/gemini-api/docs/models) for more details on each model.
 
-
-
-
-## Additional LLM Providers
-
-Datacoves Copilot supports additional LLM providers, you can find the secret value format for each one of them and additional documentation.
-
-### DeepSeek
+## DeepSeek LLM Provider
 
 Datacoves Copilot supports accessing models through the DeepSeek API, including deepseek-chat and deepseek-reasoner.
 
 Website: https://platform.deepseek.com/
 
-#### Secret value format
+### Secret value format
 
 ```json
 {
@@ -462,26 +424,26 @@ Website: https://platform.deepseek.com/
 }
 ```
 
-#### Getting an API Key
+### Getting an API Key
 
 1. Sign Up/Sign In: Go to the DeepSeek Platform. Create an account or sign in.
 2. Navigate to API Keys: Find your API keys in the API keys section of the platform.
 3. Create a Key: Click "Create new API key". Give your key a descriptive name (e.g., "Datacoves").
 4. Copy the Key: Important: Copy the API key immediately. You will not be able to see it again. Store it securely.
 
-#### Supported Models (`apiModelId`)
+### Supported Models (`apiModelId`)
 
 - deepseek-chat (Recommended for coding tasks)
 - deepseek-reasoner (Recommended for reasoning tasks)
 - deepseek-r1
 
-### Open Router
+## Open Router LLM Provider
 
 OpenRouter is an AI platform that provides access to a wide variety of language models from different providers, all through a single API. This can simplify setup and allow you to easily experiment with different models.
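Because the secret value is raw JSON pasted by hand, a typo'd model id only surfaces once Copilot tries to use it. The DeepSeek section above documents the valid `apiModelId` values, so a small pre-flight check can catch both malformed JSON and an unknown id before the secret is saved. This is an illustrative sketch, not part of Datacoves: it assumes only that the value is JSON with an `apiModelId` key, as the provider sections show; `check_secret_value` is a hypothetical helper name.

```python
import json

# Valid DeepSeek ids documented above; other providers have their own lists.
SUPPORTED_DEEPSEEK_IDS = {"deepseek-chat", "deepseek-reasoner", "deepseek-r1"}

def check_secret_value(raw: str) -> str:
    """Parse a Raw JSON secret value and return its apiModelId
    if it is one of the documented DeepSeek ids."""
    config = json.loads(raw)  # raises ValueError on malformed JSON
    model = config.get("apiModelId")
    if model not in SUPPORTED_DEEPSEEK_IDS:
        raise ValueError(f"unknown DeepSeek apiModelId: {model!r}")
    return model
```

Everything else in the secret (API key, optional settings) is left for Datacoves itself to validate, since the full format varies by provider.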
 Website: https://openrouter.ai/
 
-#### Secret value format
+### Secret value format
 
 ```json
 {
@@ -495,25 +457,25 @@ Website: https://openrouter.ai/
 }
 ```
 
-#### Getting an API Key
+### Getting an API Key
 
 1. Sign Up/Sign In: Go to the OpenRouter website. Sign in with your Google or GitHub account.
 2. Get an API Key: Go to the keys page. You should see an API key listed. If not, create a new key.
 3. Copy the Key: Copy the API key.
 
-#### Supported Models (`openRouterModelId`)
+### Supported Models (`openRouterModelId`)
 
 OpenRouter supports a large and growing number of models.
 
 Refer to the [OpenRouter Models page](https://openrouter.ai/models) for the complete and up-to-date list.
 
-### xAI Grok
+## xAI Grok LLM Provider
 
 xAI is the company behind Grok, a large language model known for its conversational abilities and large context window. Grok models are designed to provide helpful, informative, and contextually relevant responses.
 
 Website: https://x.ai/
 
-#### Secret value format
+### Secret value format
 
 ```json
 {
@@ -527,14 +489,14 @@ Website: https://x.ai/
 }
 ```
 
-#### Getting an API Key
+### Getting an API Key
 
 1. Sign Up/Sign In: Go to the xAI Console. Create an account or sign in.
 2. Navigate to API Keys: Go to the API keys section in your dashboard.
 3. Create a Key: Click to create a new API key. Give your key a descriptive name (e.g., "Datacoves").
 4. Copy the Key: Important: Copy the API key immediately. You will not be able to see it again. Store it securely.
-#### Supported Models (`apiModelId`)
+### Supported Models (`apiModelId`)
 
 - grok-code-fast-1 (Default) - xAI's Grok Code Fast model with 262K context window and prompt caching, optimized for reasoning and coding tasks
 - grok-4 - xAI's Grok-4 model with 262K context window, image support, and prompt caching
@@ -546,7 +508,3 @@ Website: https://x.ai/
 - grok-2-vision-1212 - xAI's Grok-2 Vision model (version 1212) with image support and 32K context window
 
 Learn more about available models at [xAI Docs](https://docs.x.ai/docs/models).
-
-
-
-
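The xAI ids above differ mainly in context window, which matters when picking a model for long prompts. A sketch of a lookup keyed on the documented values (windows are kept as the doc's own "K" strings, since the list does not give exact token counts; only the ids visible in this diff are included):

```python
# Context windows exactly as documented in the xAI model list above.
GROK_CONTEXT_WINDOWS = {
    "grok-code-fast-1": "262K",  # default; prompt caching, coding-optimized
    "grok-4": "262K",            # image support, prompt caching
    "grok-2-vision-1212": "32K", # vision model
}

def context_window(api_model_id: str) -> str:
    """Return the documented context window for a Grok apiModelId."""
    try:
        return GROK_CONTEXT_WINDOWS[api_model_id]
    except KeyError:
        raise ValueError(f"model not in the documented list: {api_model_id!r}")
```

For ids elided from this diff, consult the xAI Docs link above rather than extending the table by guesswork.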