
Commit be0bfd1

Merge branch 'current' into style-changes-4
2 parents a3034eb + 8a2e86e commit be0bfd1


49 files changed (+1012, −86 lines)

website/docs/docs/build/udfs.md

Lines changed: 4 additions & 4 deletions

@@ -10,7 +10,7 @@ User-defined functions (UDFs) enable you to define and register custom functions
 UDFs are particularly valuable for sharing logic across multiple tools, standardizing complex business calculations, improving performance for compute-intensive operations (since they're compiled and optimized by your warehouse's query engine), and version controlling custom logic within your dbt project.
 
-dbt creates, updates, and renames UDFs as part of DAG execution. The UDF is built in the warehouse before the model that references it.
+dbt creates, updates, and renames UDFs as part of DAG execution. The UDF is built in the warehouse before the model that references it. Refer to [listing and selecting UDFs](/docs/build/udfs#listing-and-selecting-udfs) for more info on how to build UDFs in your project.
 
 ## Prerequisites

@@ -222,11 +222,11 @@ unit_tests:
 ## Listing and selecting UDFs
 
-To list UDFs in your project, run `dbt list --select "resource_type:function"` or `dbt list --resource-type function`.
+Use the [`list` command](/reference/commands/list#listing-functions) to list UDFs in your project: `dbt list --select "resource_type:function"` or `dbt list --resource-type function`.
 
-To select UDFs when building a project, run `dbt build --select "resource_type:function"`.
+Use the [`build` command](/reference/commands/build#functions) to select UDFs when building a project: `dbt build --select "resource_type:function"`.
 
-For more information about selecting UDFs, see the examples in [Node selector methods](/reference/node-selection/methods).
+For more information about selecting UDFs, see the examples in [Node selector methods](/reference/node-selection/methods#file).
 
 ## Limitations
 - Creating UDFs in other languages (for example, Python, Java, or Scala) is not yet supported.
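The `resource_type:function` selection documented in the udfs.md hunk can be sketched in miniature. This is a hypothetical illustration of what selecting nodes by resource type means, not dbt's actual selector implementation; the node IDs are invented for the example.

```python
# Hypothetical sketch: filter a project's parsed nodes the way
# `dbt list --select "resource_type:function"` narrows selection to UDFs.
# The node dictionary below is made up for illustration.
def select_by_resource_type(nodes, resource_type):
    """Return unique IDs of nodes whose resource_type matches."""
    return [uid for uid, node in nodes.items()
            if node.get("resource_type") == resource_type]

nodes = {
    "function.my_project.cents_to_dollars": {"resource_type": "function"},
    "model.my_project.orders": {"resource_type": "model"},
}
print(select_by_resource_type(nodes, "function"))
# prints only the UDF node, not the model
```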

website/docs/docs/cloud/connect-data-platform/connect-amazon-athena.md

Lines changed: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ Connect <Constant name="cloud" /> to Amazon's Athena interactive query service t
 | AWS S3 temp tables prefix | s3_tmp_table_dir | Prefix for storing temporary tables, if different from the connection's s3_data_dir | String | Optional | s3://bucket3/dbt/ |
 | Poll interval | poll_interval | Interval in seconds to use for polling the status of query results in Athena | Integer | Optional | 5 |
 | Query retries | num_retries | Number of times to retry a failing query | Integer | Optional | 3 |
-| Boto3 retries | num_boto3_retries | Number of times to retry boto3 requests (e.g. deleting S3 files for materialized tables) | Integer | Optional | 5 |
+| Boto3 retries | num_boto3_retries | Number of times to retry boto3 requests (for example, deleting S3 files for materialized tables) | Integer | Optional | 5 |
 | Iceberg retries | num_iceberg_retries | Number of times to retry iceberg commit queries to fix ICEBERG_COMMIT_ERROR | Integer | Optional | 0 |
 
 ### Development credentials
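The `poll_interval` and `num_retries` settings in the Athena table above describe a poll-until-done loop with a bounded number of query retries. The sketch below is a hedged, generic illustration of that behavior under stated assumptions; `run_query` and `get_status` are stand-in callables, not the dbt-athena adapter's real API.

```python
import time

# Hypothetical sketch of the polling/retry semantics the Athena connection
# settings describe: poll the query status every `poll_interval` seconds,
# and retry a failing query up to `num_retries` times.
def run_with_retries(run_query, get_status, num_retries=3,
                     poll_interval=5, sleep=time.sleep):
    for _attempt in range(num_retries + 1):
        query_id = run_query()
        while True:
            status = get_status(query_id)
            if status == "SUCCEEDED":
                return query_id
            if status == "FAILED":
                break  # fall through to the next retry attempt
            sleep(poll_interval)  # still running; wait before polling again
    raise RuntimeError(f"query failed after {num_retries} retries")
```

Injecting `sleep` as a parameter keeps the sketch testable without real waits.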

website/docs/docs/cloud/secure/az-postgres-private-link.md

Lines changed: 1 addition & 1 deletion

@@ -29,7 +29,7 @@ From your Azure portal:
 - Postgres Flexible Server name:
 - Azure Database for Postgres Flexible Server resource ID:
 - dbt Azure multi-tenant environment (EMEA):
-- Azure Postgres server region (e.g., WestEurope, NorthEurope):
+- Azure Postgres server region (for example, WestEurope, NorthEurope):
 ```
 5. Once our support team confirms the endpoint has been created, navigate to the Azure Database for Postgres Flexible Server in the Azure Portal and browse to **Settings** > **Networking**. In the **Private Endpoints** section, highlight the `dbt` named option and select **Approve**. Confirm with Support that the connection has been approved so they can validate the connection and make it available for use in <Constant name="cloud" />.

website/docs/docs/cloud/secure/az-synapse-private-link.md

Lines changed: 1 addition & 1 deletion

@@ -29,7 +29,7 @@ From your Azure portal:
 - Server name:
 - Azure Synapse workspace resource ID:
 - dbt Azure multi-tenant environment (EMEA):
-- Azure Synapse workspace region (e.g., WestEurope, NorthEurope):
+- Azure Synapse workspace region (for example, WestEurope, NorthEurope):
 ```
 5. Once our support team confirms the endpoint has been created, navigate to the Azure Synapse workspace in the Azure Portal and browse to **Security** > **Private endpoint connections**. In the **Private endpoint connections** table, highlight the `dbt` named option and select **Approve**. Confirm with Support that the connection has been approved so they can validate the connection and make it available for use in <Constant name="cloud" />.

website/docs/docs/cloud/secure/databricks-private-link.md

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ The following steps will walk you through the setup of a Databricks Azure Privat
 Subject: New Azure Multi-Tenant Private Link Request
 - Type: Databricks
 - Databricks instance name:
-- Azure Databricks Workspace URL (e.g. adb-################.##.azuredatabricks.net)
+- Azure Databricks Workspace URL (for example, adb-################.##.azuredatabricks.net)
 - Databricks Azure resource ID:
 - dbt Azure multi-tenant environment (EMEA):
 - Azure Databricks workspace region (like WestEurope, NorthEurope):

website/docs/docs/cloud/secure/databricks-privatelink.md

Lines changed: 1 addition & 1 deletion

@@ -28,7 +28,7 @@ The following steps will walk you through the setup of a Databricks AWS PrivateL
 Subject: New AWS Multi-Tenant PrivateLink Request
 - Type: Databricks
 - Databricks instance name:
-- Databricks cluster AWS Region (e.g., us-east-1, eu-west-2):
+- Databricks cluster AWS Region (for example, us-east-1, eu-west-2):
 - dbt AWS multi-tenant environment (US, EMEA, AU):
 ```
 <PrivateLinkSLA />

website/docs/docs/cloud/secure/postgres-privatelink.md

Lines changed: 1 addition & 1 deletion

@@ -77,7 +77,7 @@ Once the VPC Endpoint Service is provisioned, you can find the service name in t
 Subject: New Multi-Tenant PrivateLink Request
 - Type: Postgres Interface-type
 - VPC Endpoint Service Name:
-- Postgres server AWS Region (e.g., us-east-1, eu-west-2):
+- Postgres server AWS Region (for example, us-east-1, eu-west-2):
 - dbt AWS multi-tenant environment (US, EMEA, AU):
 ```

website/docs/docs/cloud/secure/redshift-privatelink.md

Lines changed: 3 additions & 3 deletions

@@ -50,7 +50,7 @@ AWS provides two different ways to create a PrivateLink VPC endpoint for a Redsh
 - Type: Redshift-managed
 - Redshift cluster name:
 - Redshift cluster AWS account ID:
-- Redshift cluster AWS Region (e.g., us-east-1, eu-west-2):
+- Redshift cluster AWS Region (for example, us-east-1, eu-west-2):
 - <Constant name="cloud" /> multi-tenant environment (US, EMEA, AU):
 ```

@@ -60,7 +60,7 @@ AWS provides two different ways to create a PrivateLink VPC endpoint for a Redsh
 - Type: Redshift-managed - Serverless
 - Redshift workgroup name:
 - Redshift workgroup AWS account ID:
-- Redshift workgroup AWS Region (e.g., us-east-1, eu-west-2):
+- Redshift workgroup AWS Region (for example, us-east-1, eu-west-2):
 - <Constant name="cloud" /> multi-tenant environment (US, EMEA, AU):
 ```

@@ -125,7 +125,7 @@ Once the VPC Endpoint Service is provisioned, you can find the service name in t
 Subject: New Multi-Tenant PrivateLink Request
 - Type: Redshift Interface-type
 - VPC Endpoint Service Name:
-- Redshift cluster AWS Region (e.g., us-east-1, eu-west-2):
+- Redshift cluster AWS Region (for example, us-east-1, eu-west-2):
 - dbt AWS multi-tenant environment (US, EMEA, AU):
 ```

website/docs/docs/cloud/secure/vcs-privatelink.md

Lines changed: 1 addition & 1 deletion

@@ -86,7 +86,7 @@ Subject: New Multi-Tenant PrivateLink Request
 - Custom DNS (if HTTPS)
 - Private hosted zone:
 - DNS record:
-- VCS install AWS Region (e.g., us-east-1, eu-west-2):
+- VCS install AWS Region (for example, us-east-1, eu-west-2):
 - dbt AWS multi-tenant environment (US, EMEA, AU):
 ```

website/docs/docs/cloud/use-dbt-copilot.md

Lines changed: 10 additions & 8 deletions

@@ -21,7 +21,7 @@ This page explains how to use <Constant name="copilot" /> to:
 - [Generate and edit SQL inline](#generate-and-edit-sql-inline) &mdash; Use natural language prompts to generate SQL code from scratch or to edit an existing SQL file by using keyboard shortcuts or highlighting code in the [<Constant name="cloud_ide" />](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud).
 - [Build visual models](#build-visual-models) &mdash; Use <Constant name="copilot" /> to generate models in [<Constant name="visual_editor" />](/docs/cloud/use-canvas) with natural language prompts.
 - [Build queries](#build-queries) &mdash; Use <Constant name="copilot" /> to generate queries in [<Constant name="query_page" />](/docs/explore/dbt-insights) for exploratory data analysis using natural language prompts.
-- [Analyze data with the <Constant name="copilot" /> agent](#analyze-data-with-the-copilot-agent) &mdash; Use <Constant name="copilot" /> to analyze your data and get contextualized results in real time by asking a natural language question to the <Constant name="copilot" /> agent.
+- [Analyze data with the Analyst agent](#analyze-data-with-the-analyst-agent) &mdash; Use <Constant name="copilot" /> to analyze your data and get contextualized results in real time by asking a natural language question to the Analyst agent.
 
 :::tip
 Check out our [dbt Copilot on-demand course](https://learn.getdbt.com/learn/course/dbt-copilot/welcome-to-dbt-copilot/welcome-5-mins) to learn how to use <Constant name="copilot" /> to generate resources, and more!

@@ -59,21 +59,23 @@ To begin building SQL queries with natural language prompts in <Constant name="q
 
 <Lightbox src="/img/docs/dbt-insights/insights-copilot.gif" width="95%" title="dbt Copilot in dbt Insights" />
 
-## Analyze data with the Copilot agent <Lifecycle status='private_beta' />
+## Analyze data with the Analyst agent <Lifecycle status='private_beta' />
 
-Use dbt <Constant name="copilot" /> to analyze your data and get contextualized results in real time by asking natural language questions to the [<Constant name="query_page" />](/docs/explore/dbt-insights) <Constant name="copilot" /> agent. Before you begin, make sure you can [access <Constant name="query_page" />](/docs/explore/access-dbt-insights).
+Use dbt <Constant name="copilot" /> to analyze your data and get contextualized results in real time by asking natural language questions to the [<Constant name="query_page" />](/docs/explore/dbt-insights) Analyst agent. To request access to the Analyst agent, [join the waitlist](https://www.getdbt.com/product/dbt-agents#dbt-Agents-signup).
+
+Before you begin, make sure you can [access <Constant name="query_page" />](/docs/explore/access-dbt-insights).
 
 1. Click the **<Constant name="copilot" />** icon in the Query console sidebar menu.
 2. Click **Agent**.
 3. In the dbt <Constant name="copilot" /> prompt box, enter your question.
 4. Click **** to submit your question.
 
-The <Constant name="copilot" /> agent then translates natural language questions into structured queries, executes queries against governed dbt models and metrics, and returns results with references, assumptions, and possible next steps.
+The agent then translates natural language questions into structured queries, executes queries against governed dbt models and metrics, and returns results with references, assumptions, and possible next steps.
 
-The <Constant name="copilot" /> agent can loop through these steps multiple times if it hasn't reached a complete answer, allowing for complex, multi-step analysis.
+The agent can loop through these steps multiple times if it hasn't reached a complete answer, allowing for complex, multi-step analysis.
 
-5. Confirm the results or continue asking the <Constant name="copilot" /> agent for more insights about your data.
+5. Confirm the results or continue asking the agent for more insights about your data.
 
-Your conversation with the <Constant name="copilot" /> agent remains even if you switch tabs within dbt <Constant name="query_page" />. However, they disappear when you navigate out of <Constant name="query_page" /> or when you close your browser.
+Your conversation with the agent remains even if you switch tabs within dbt <Constant name="query_page" />. However, it disappears when you navigate out of <Constant name="query_page" /> or when you close your browser.
 
-<Lightbox src="/img/docs/dbt-insights/insights-copilot-agent.png" width="60%" title="Using the Copilot agent in Insights" />
+<Lightbox src="/img/docs/dbt-insights/insights-copilot-agent.png" width="60%" title="Using the Analyst agent in Insights" />
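The translate-execute-check loop the use-dbt-copilot.md diff describes (the agent iterates until it has a complete answer) can be sketched generically. This is a hypothetical illustration of an agentic loop with a step budget, not the Analyst agent's real implementation; every callable here (`translate`, `execute`, `is_complete`) is a stand-in.

```python
# Hypothetical sketch of the agent behavior described in the doc: translate
# the question into a query, execute it, and repeat until the accumulated
# results look complete or a step budget runs out.
def agent_loop(question, translate, execute, is_complete, max_steps=5):
    context = []  # results gathered so far, fed back into translation
    for _ in range(max_steps):
        query = translate(question, context)
        result = execute(query)
        context.append(result)
        if is_complete(context):
            return context  # complete answer reached
    return context  # budget exhausted; return partial analysis
```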
