diff --git a/website/docs/docs/build/sql-models.md b/website/docs/docs/build/sql-models.md
index 87e063cdcdb..a019508d370 100644
--- a/website/docs/docs/build/sql-models.md
+++ b/website/docs/docs/build/sql-models.md
@@ -260,7 +260,7 @@ Additionally, the `ref` function encourages you to write modular transformations
## Testing and documenting models
-You can also document and test models — skip ahead to the section on [testing](/docs/build/data-tests) and [documentation](/docs/collaborate/documentation) for more information.
+You can also document and test models — skip ahead to the section on [testing](/docs/build/data-tests) and [documentation](/docs/build/documentation) for more information.
## Additional FAQs
diff --git a/website/docs/docs/cloud/dbt-assist.md b/website/docs/docs/cloud/dbt-assist.md
index cac5457812a..eafe7d05821 100644
--- a/website/docs/docs/cloud/dbt-assist.md
+++ b/website/docs/docs/cloud/dbt-assist.md
@@ -8,7 +8,7 @@ pagination_prev: null
# About dbt Assist
-dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works. dbt Assist’s AI co-pilot generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time.
+dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works. dbt Assist’s AI co-pilot generates [documentation](/docs/build/documentation) and [tests](/docs/build/data-tests) for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time.
:::tip Beta feature
dbt Assist is an AI tool meant to _help_ developers generate documentation and tests in dbt Cloud. It's available in beta, in the dbt Cloud IDE only.
diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
index 1e561b379b4..e2fb122cba3 100644
--- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
+++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md
@@ -131,7 +131,7 @@ Nice job, you're ready to start developing and building models 🎉!
- **Generate your YAML configurations with dbt Assist** — [dbt Assist](/docs/cloud/dbt-assist) is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud. It generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. Available for dbt Cloud Enterprise plans.
-- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs#generating-documentation) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
+- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
## Related docs
diff --git a/website/docs/docs/collaborate/build-and-view-your-docs.md b/website/docs/docs/collaborate/build-and-view-your-docs.md
new file mode 100644
index 00000000000..ad43795a38c
--- /dev/null
+++ b/website/docs/docs/collaborate/build-and-view-your-docs.md
@@ -0,0 +1,85 @@
+---
+title: "Build and view your docs with dbt Cloud"
+id: "build-and-view-your-docs"
+description: "Automatically generate project documentation as you run jobs."
+pagination_next: null
+---
+
+dbt Cloud enables you to generate documentation for your project and data platform. The documentation is automatically updated with new information after a fully successful job run, ensuring accuracy and relevance.
+
+The default documentation experience in dbt Cloud is [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). Use it to view your project's resources (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state.
+
+Refer to [documentation](/docs/build/documentation) for more configuration details.
+
+This shift makes [dbt Docs](#dbt-docs) a legacy documentation feature in dbt Cloud. dbt Docs is still accessible and provides basic documentation, but it doesn't offer the same speed, metadata, or visibility as dbt Explorer. dbt Docs is available to dbt Cloud Developer plan and dbt Core users.
+
+## Set up a documentation job
+
+dbt Explorer uses the [metadata](/docs/collaborate/explore-projects#generate-metadata) generated after each job run in the production or staging environment, ensuring it always has the latest project results. To view richer metadata, you can set up documentation for a job in dbt Cloud when you edit your job settings or create a new job.
+
+Configure the job to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) when it runs. This step is necessary if you want to view columns and statistics for models, sources, and snapshots in dbt Explorer.
+
+To set up a job to generate docs:
+
+1. In the top left, click **Deploy** and select **Jobs**.
+2. Create a new job or select an existing job and click **Settings**.
+3. Under **Execution Settings**, select **Generate docs on run** and click **Save**.
+
+
+*Note: If you use dbt Docs, you need to configure the job to generate docs when it runs and then manually link that job to your project. Proceed to [configure project documentation](#configure-project-documentation) so your project generates the documentation when this job runs.*
+
+You can also add the [`dbt docs generate` command](/reference/commands/cmd-docs) to the list of commands in the job run steps. However, you can expect different outcomes when you add the command to the run steps instead of selecting the **Generate docs on run** checkbox.
+
+Review the following options and outcomes:
+
+| Options | Outcomes |
+|--------| ------- |
+| **Select checkbox** | Select the **Generate docs on run** checkbox to automatically generate updated project docs each time your job runs. If that particular step in your job fails, the job can still be successful if all subsequent steps are successful. |
+| **Add as a run step** | Add `dbt docs generate` to the list of commands in the job run steps, in whatever order you prefer. If that particular step in your job fails, the job will fail and all subsequent steps will be skipped. |
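+
+For example, if you add the command as an explicit run step, the job's commands might look like the following (a minimal sketch; the other commands and their order are up to you):
+
+```shell
+dbt build
+dbt docs generate
+```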
+
+:::tip Tip — Documentation-only jobs
+
+To create and schedule documentation-only jobs at the end of your production jobs, add the `dbt compile` command in the **Commands** section, as shown in the example after this tip.
+
+:::
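+
+A documentation-only job of this kind might use a single run step (an illustrative sketch), with the **Generate docs on run** checkbox selected so the docs are generated after the compile:
+
+```shell
+dbt compile    # builds the project artifacts without running models; docs are generated by the checkbox
+```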
+
+## dbt Docs
+
+dbt Docs, available to dbt Cloud Developer plan and dbt Core users, generates a website from your dbt project using the `dbt docs generate` command. It provides a central location to view your project's resources, such as models, tests, and lineage, and helps you understand the data in your warehouse.
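+
+For example, dbt Core users can generate the site and preview it locally (a minimal sketch, run from your project directory):
+
+```shell
+dbt docs generate    # compile the project and build catalog.json and manifest.json
+dbt docs serve       # serve the generated docs site locally in your browser
+```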
+
+### Configure project documentation
+
+Configure project documentation so that documentation is generated when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation.
+
+1. Click the gear icon in the top right.
+2. Select **Account Settings**.
+3. Navigate to **Projects** and select the project that needs documentation.
+4. Click **Edit**.
+5. Under **Artifacts**, select the job that should generate docs when it runs and click **Save**.
+
+
+:::tip Use dbt Explorer for a richer documentation experience
+For a richer and more interactive experience, try out [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). It includes map layers of your DAG, keyword search, IDE integration, model performance, project recommendations, and more.
+:::
+
+### Generating documentation
+
+To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the **Command Bar**. This command generates documentation for your dbt project as it exists in development in your IDE session.
+
+After generating your documentation, you can click **Explore** in the navigation. This will take you to dbt Explorer, where you can view your project's resources and their lineage.
+
+
+
+After running `dbt docs generate` in the dbt Cloud IDE, click the icon above the file tree to see the latest version of your documentation rendered in a new browser window.
+
+### View documentation
+
+Once you set up a job to generate documentation for your project, you can click **Explore** in the navigation and then click **dbt Docs**. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud.
+
+These generated docs always reflect the last fully successful run. If any task in a run fails, including tests, the documentation won't be updated with that run's changes.
+
+The dbt Cloud IDE makes it possible to view [documentation](/docs/build/documentation) for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
+
+## Related docs
+- [Documentation](/docs/build/documentation)
+- [dbt Explorer](/docs/collaborate/explore-projects)
diff --git a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
deleted file mode 100644
index 0129b43f305..00000000000
--- a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
+++ /dev/null
@@ -1,68 +0,0 @@
----
-title: "Build and view your docs with dbt Cloud"
-id: "build-and-view-your-docs"
-description: "Automatically generate project documentation as you run jobs."
-pagination_next: null
----
-
-dbt Cloud enables you to generate documentation for your project and data platform, rendering it as a website. The documentation is only updated with new information after a fully successful job run, ensuring accuracy and relevance. Refer to [Documentation](/docs/collaborate/documentation) for more details.
-
-## Set up a documentation job
-
-You can set up documentation for a job in dbt Cloud when you edit your job settings or create a new job. You need to configure the job to generate docs when it runs, then link that job to your project.
-
-To set up a job to generate docs:
-
-1. In the top left, click **Deploy** and select **Jobs**.
-2. Create a new job or select an existing job and click **Settings**.
-3. Under "Execution Settings," select **Generate docs on run**.
-
-
-4. Click **Save**. Proceed to [configure project documentation](#configure-project-documentation) so your project generates the documentation when this job runs.
-
-You can also add `dbt docs generate` to the list of commands in the job run steps. However, you can expect different outcomes when adding the command to the run steps compared to configuring a job selecting the **Generate docs on run** checkbox (shown in previous steps).
-
-Review the following options and outcomes:
-
-| Options | Outcomes |
-|--------| ------- |
-| **Select checkbox** | Select the **Generate docs on run** checkbox to automatically generate updated project docs each time your job runs. If that particular step in your job fails, the job can still be successful if all subsequent steps are successful. |
-| **Add as a run step** | Add `dbt docs generate` to the list of commands in the job run steps, in whatever order you prefer. If that particular step in your job fails, the job will fail and all subsequent steps will be skipped. |
-
-:::tip Tip — Documentation-only jobs
-
-To create and schedule documentation-only jobs at the end of your production jobs, add the `dbt compile` command in the **Commands** section.
-
-:::
-
-## Configure project documentation
-
-You configure project documentation to generate documentation when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation.
-
-1. Click the gear icon in the top right.
-2. Select **Account Settings**.
-3. Navigate to **Projects** and select the project that needs documentation.
-4. Click **Edit**.
-5. Under **Artifacts**, select the job that should generate docs when it runs.
-
-6. Click **Save**.
-
-## Generating documentation
-
-To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the
-Command Bar in the dbt Cloud IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session.
-
-
-
-After generating your documentation, you can click the **Book** icon above the file tree, to see the latest version of your documentation rendered in a new browser window.
-
-## Viewing documentation
-
-Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud.
-
-These generated docs always show the last fully successful run, which means that if you have any failed tasks, including tests, then you will not see changes to the docs by this run. If you don't see a fully successful run, then you won't see any changes to the documentation.
-
-The dbt Cloud IDE makes it possible to view [documentation](/docs/collaborate/documentation)
-for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
-
-
diff --git a/website/docs/docs/collaborate/collaborate-with-others.md b/website/docs/docs/collaborate/collaborate-with-others.md
index 7875a8044b6..c8c8bd4657f 100644
--- a/website/docs/docs/collaborate/collaborate-with-others.md
+++ b/website/docs/docs/collaborate/collaborate-with-others.md
@@ -8,7 +8,7 @@ pagination_prev: null
@@ -26,7 +26,7 @@ pagination_prev: null
-
\ No newline at end of file
+
diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md
index a92d5a69ad1..aa549520f34 100644
--- a/website/docs/docs/collaborate/explore-projects.md
+++ b/website/docs/docs/collaborate/explore-projects.md
@@ -1,8 +1,8 @@
---
-title: "Explore your dbt projects"
-sidebar_label: "Explore dbt projects"
-description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your data pipelines."
-pagination_next: "docs/collaborate/model-performance"
+title: "Discover data with dbt Explorer"
+sidebar_label: "Discover data with dbt Explorer"
+description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your dbt projects."
+pagination_next: "docs/collaborate/column-level-lineage"
pagination_prev: null
---
@@ -12,28 +12,30 @@ With dbt Explorer, you can view your project's [resources](/docs/build/projects)
- You have a dbt Cloud account on the [Team or Enterprise plan](https://www.getdbt.com/pricing/).
- You have set up a [production](/docs/deploy/deploy-environments#set-as-production-environment) or [staging](/docs/deploy/deploy-environments#create-a-staging-environment) deployment environment for each project you want to explore.
- - There has been at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer.
-- You are on the dbt Explorer page. To do this, select **Explore** from the top navigation bar in dbt Cloud.
+- You have at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer.
+- You are on the dbt Explorer page. To do this, select **Explore** from the navigation in dbt Cloud.
+
-## Generate metadata
+## Generate metadata
-dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ or _staging_ in your dbt Cloud project. dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project.
+dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ or _staging_ in your dbt Cloud project.
-Note that CI jobs do not update dbt Explorer. This is because they don't reflect the production state and don't provide the necessary metadata updates.
-
-To view a resource and its metadata, you must define the resource in your project and run a job in the production or staging environment. The resulting metadata depends on the [commands](/docs/deploy/job-commands) executed by the jobs.
+- dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project. This includes deploy and merge jobs.
+- Note that CI jobs do not update dbt Explorer. This is because they don't reflect the production state and don't provide the necessary metadata updates.
+- To view a resource and its metadata, you must define the resource in your project and run a job in the production or staging environment.
+- The resulting metadata depends on the [commands](/docs/deploy/job-commands) executed by the jobs.
| To view in Explorer | You must successfully run |
|---------------------|---------------------------|
| Model lineage, details, or results | [dbt run](/reference/commands/run) or [dbt build](/reference/commands/build) on a given model within a job in the environment |
-| Columns and statistics for models, sources, and snapshots| [dbt docs generate](/reference/commands/cmd-docs) within a job in the environment |
+| Columns and statistics for models, sources, and snapshots| [dbt docs generate](/reference/commands/cmd-docs) within [a job](/docs/collaborate/build-and-view-your-docs) in the environment |
| Test results | [dbt test](/reference/commands/test) or [dbt build](/reference/commands/build) within a job in the environment |
| Source freshness results | [dbt source freshness](/reference/commands/source#dbt-source-freshness) within a job in the environment |
| Snapshot details | [dbt snapshot](/reference/commands/snapshot) or [dbt build](/reference/commands/build) within a job in the environment |
| Seed details | [dbt seed](/reference/commands/seed) or [dbt build](/reference/commands/build) within a job in the environment |
-Richer and more timely metadata will become available as dbt Cloud evolves.
+Richer and more timely metadata will become available as dbt Cloud evolves.
## Explore your project's lineage graph {#project-lineage}
@@ -184,7 +186,7 @@ In the upper right corner of the resource details page, you can:
- **Status bar** (below the page title) — Information on the last time the model ran, whether the run was successful, how the data is materialized, number of rows, and the size of the model.
- **General** tab includes:
- **Lineage** graph — The model’s lineage graph that you can interact with. The graph includes one upstream node and one downstream node from the model. Click the Expand icon in the graph's upper right corner to view the model in full lineage graph mode.
- - **Description** section — A [description of the model](/docs/collaborate/documentation#adding-descriptions-to-your-project).
+ - **Description** section — A [description of the model](/docs/build/documentation#adding-descriptions-to-your-project).
- **Recent** section — Information on the last time the model ran, how long it ran for, whether the run was successful, the job ID, and the run ID.
- **Tests** section — [Tests](/docs/build/data-tests) for the model, including a status indicator for the latest test status. A :white_check_mark: denotes a passing test.
- **Details** section — Key properties like the model’s relation name (for example, how it’s represented and how you can query it in the data platform: `database.schema.identifier`); model governance attributes like access, group, and if contracted; and more.
diff --git a/website/docs/docs/dbt-cloud-apis/discovery-api.md b/website/docs/docs/dbt-cloud-apis/discovery-api.md
index 438cf431060..0345c647dd9 100644
--- a/website/docs/docs/dbt-cloud-apis/discovery-api.md
+++ b/website/docs/docs/dbt-cloud-apis/discovery-api.md
@@ -50,7 +50,8 @@ Use the API to find and understand dbt assets in integrated tools using informat
Data producers must manage and organize data for stakeholders, while data consumers need to quickly and confidently analyze data on a large scale to make informed decisions that improve business outcomes and reduce organizational overhead. The API is useful for discovery data experiences in catalogs, analytics, apps, and machine learning (ML) tools. It can help you understand the origin and meaning of datasets for your analysis.
-
+
+
diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md
index f14fd03a534..38cc7c69b6a 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md
@@ -69,7 +69,7 @@ can override schema test definitions
- [`full_refresh` config](/reference/resource-configs/full_refresh)
**Docs**
-- [project-level overviews](/docs/collaborate/documentation#custom-project-level-overviews)
+- [project-level overviews](/docs/build/documentation#custom-project-level-overviews)
**Redshift**
- [`iam_profile`](/docs/core/connect-data-platform/redshift-setup#specifying-an-iam-profile)
diff --git a/website/docs/docs/deploy/artifacts.md b/website/docs/docs/deploy/artifacts.md
index 9b3ae71e79c..cff36bfafba 100644
--- a/website/docs/docs/deploy/artifacts.md
+++ b/website/docs/docs/deploy/artifacts.md
@@ -4,13 +4,23 @@ id: "artifacts"
description: "Use artifacts to power your automated docs site and source freshness data."
---
-When running dbt jobs, dbt Cloud generates and saves *artifacts*. You can use these artifacts, like `manifest.json`, `catalog.json`, and `sources.json` to power different aspects of dbt Cloud, namely: [dbt Docs](/docs/collaborate/documentation) and [source freshness reporting](/docs/build/sources#snapshotting-source-data-freshness).
+When running dbt jobs, dbt Cloud generates and saves *artifacts*. You can use these artifacts, like `manifest.json`, `catalog.json`, and `sources.json` to power different aspects of dbt Cloud, namely: [dbt Explorer](/docs/collaborate/explore-projects), [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), and [source freshness reporting](/docs/build/sources#snapshotting-source-data-freshness).
## Create dbt Cloud Artifacts
-While running any job can produce artifacts, you should only associate one production job with a given project to produce the project's artifacts. You can designate this connection in the **Project details** page. To access this page, click the gear icon in the upper right, select **Account Settings**, select your project, and click **Edit** in the lower right. Under **Artifacts**, select the jobs you want to produce documentation and source freshness artifacts for.
+[dbt Explorer](/docs/collaborate/explore-projects#generate-metadata) uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). It uses metadata from your staging and production [deployment environments](/docs/deploy/deploy-environments) (development environment metadata is coming soon).
-
+dbt Explorer automatically retrieves metadata updates after each job run in the production or staging deployment environment, so it always has the latest results for your project.
+
+To view a resource, its metadata, and what commands are needed, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details.
+
+
+
+The following steps are for legacy dbt Docs only. For the current documentation experience, see [dbt Explorer](/docs/collaborate/explore-projects).
+
+While running any job can produce artifacts, you should only associate one production job with a given project to produce the project's artifacts. You can designate this connection on the **Project details** page. To access this page, click the gear icon in the upper right, select **Account Settings**, select your project, and click **Edit** in the lower right. Under **Artifacts**, select the jobs you want to produce documentation and source freshness artifacts for.
+
+
If you don't see your job listed, you might need to edit the job and select **Run source freshness** and **Generate docs on run**.
@@ -18,17 +28,30 @@ If you don't see your job listed, you might need to edit the job and select **Ru
When you add a production job to a project, dbt Cloud updates the content and provides links to the production documentation and source freshness artifacts it generated for that project. You can see these links by clicking **Deploy** in the upper left, selecting **Jobs**, and then selecting the production job. From the job page, you can select a specific run to see how artifacts were updated for that run only.
+
+
### Documentation
-When set up, dbt Cloud updates the **Documentation** link in the header tab so it links to documentation for this job. This link always directs you to the latest version of the documentation for your project.
+Navigate to [dbt Explorer](/docs/collaborate/explore-projects) through the **Explore** link to view your project's resources and lineage and gain a better understanding of its latest production state.
-Note that both the job's commands and the docs generate step (triggered by the **Generate docs on run** checkbox) must succeed during the job invocation for the project-level documentation to be populated or updated.
+To view a resource, its metadata, and what commands are needed, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details.
+Both the job's commands and the docs generate step (triggered by the **Generate docs on run** checkbox) must succeed during the job invocation to update the documentation.
-
+
+
+When set up, dbt Cloud updates the Documentation link in the header tab so it links to documentation for this job. This link always directs you to the latest version of the documentation for your project.
+
+
### Source Freshness
-As with Documentation, configuring a job for the Source Freshness artifact setting also updates the Data Sources link under **Deploy**. The new link points to the latest Source Freshness report for the selected job.
+To view the latest source freshness results, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details, then navigate to dbt Explorer through the **Explore** link.
+
+
+
+Configuring a job for the Source Freshness artifact setting also updates the data source link under **Deploy**. The new link points to the latest Source Freshness report for the selected job.
+
+
diff --git a/website/docs/docs/deploy/job-commands.md b/website/docs/docs/deploy/job-commands.md
index 26fe1931db6..8117178b2d6 100644
--- a/website/docs/docs/deploy/job-commands.md
+++ b/website/docs/docs/deploy/job-commands.md
@@ -35,7 +35,7 @@ Every job invocation automatically includes the [`dbt deps`](/reference/commands
For every job, you have the option to select the [Generate docs on run](/docs/collaborate/build-and-view-your-docs) or [Run source freshness](/docs/deploy/source-freshness) checkboxes, enabling you to run the commands automatically.
-**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Build and view your docs](/docs/collaborate/build-and-view-your-docs) for more info.
+**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Set up a documentation job](/docs/collaborate/build-and-view-your-docs) for more info.
**Job outcome Source freshness checkbox** — dbt Cloud executes the `dbt source freshness` command as the first run step in your job. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Source freshness](/docs/deploy/source-freshness) for more info.
diff --git a/website/docs/docs/deploy/source-freshness.md b/website/docs/docs/deploy/source-freshness.md
index ab267b6d067..a409c01f82c 100644
--- a/website/docs/docs/deploy/source-freshness.md
+++ b/website/docs/docs/deploy/source-freshness.md
@@ -12,7 +12,7 @@ dbt Cloud provides a helpful interface around dbt's [source data freshness](/doc
[`dbt build`](reference/commands/build) does _not_ include source freshness checks when building and testing resources in your DAG. Instead, you can use one of these common patterns for defining jobs:
- Add `dbt build` to the run step to run models, tests, and so on.
-- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs#set-up-a-documentation-job).
+- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs).
- Select the **Run source freshness** checkbox to enable [source freshness](#checkbox) as the first step of the job.
@@ -42,4 +42,4 @@ It's important that your freshness jobs run frequently enough to snapshot data l
## Further reading
- Refer to [Artifacts](/docs/deploy/artifacts) for more info on how to create dbt Cloud artifacts, share links to the latest documentation, and share source freshness reports with your team.
-- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation).
\ No newline at end of file
+- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation).
diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md
index 980915a2c42..5301dae396d 100644
--- a/website/docs/docs/introduction.md
+++ b/website/docs/docs/introduction.md
@@ -61,7 +61,7 @@ As a dbt user, your main focus will be on writing models (select queries) that r
| Handle boilerplate code to materialize queries as relations | For each model you create, you can easily configure a *materialization*. A materialization represents a build strategy for your select query – the code behind a materialization is robust, boilerplate SQL that wraps your select query in a statement to create a new, or update an existing, relation. Read more about [Materializations](/docs/build/materializations).|
| Use a code compiler | SQL files can contain Jinja, a lightweight templating language. Using Jinja in SQL provides a way to use control structures in your queries. For example, `if` statements and `for` loops. It also enables repeated SQL to be shared through `macros`. Read more about [Macros](/docs/build/jinja-macros).|
| Determine the order of model execution | Often, when transforming data, it makes sense to do so in a staged approach. dbt provides a mechanism to implement transformations in stages through the [ref function](/reference/dbt-jinja-functions/ref). Rather than selecting from existing tables and views in your warehouse, you can select from another model.|
-| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/collaborate/documentation).|
+| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/build/documentation).|
| Test your models | Tests provide a way to improve the integrity of the SQL in each model by making assertions about the results generated by a model. Build, test, and run your project with a button click or by using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) command bar. Read more about writing tests for your models [Testing](/docs/build/data-tests)|
| Manage packages | dbt ships with a package manager, which allows analysts to use and publish both public and private repositories of dbt code which can then be referenced by others. Read more about [Package Management](/docs/build/packages). |
| Load seed files| Often in analytics, raw values need to be mapped to a more readable value (for example, converting a country-code to a country name) or enriched with static or infrequently changing data. These data sources, known as seed files, can be saved as a CSV file in your `project` and loaded into your data warehouse using the `seed` command. Read more about [Seeds](/docs/build/seeds).|
diff --git a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
index f1e631f0d78..9e254de92d8 100644
--- a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
+++ b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md
@@ -8,7 +8,7 @@ You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud
- **dbt Cloud**: A hosted application where you can develop directly from a web browser using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). It also natively supports developing using a command line interface, [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). Among other features, dbt Cloud provides:
- Development environment to help you build, test, run, and [version control](/docs/collaborate/git-version-control) your project faster.
- - Share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team.
+ - Share your [dbt project's documentation](/docs/build/documentation) with your team.
- Integrates with the dbt Cloud IDE, allowing you to run development tasks and environment in the dbt Cloud UI for a seamless experience.
- The dbt Cloud CLI to develop and run dbt commands against your dbt Cloud development environment from your local command line.
- For more details, refer to [Develop dbt](/docs/cloud/about-develop-dbt).
diff --git a/website/docs/faqs/Docs/_category_.yaml b/website/docs/faqs/Docs/_category_.yaml
index 8c7925dcc15..0a9aa44fe56 100644
--- a/website/docs/faqs/Docs/_category_.yaml
+++ b/website/docs/faqs/Docs/_category_.yaml
@@ -1,10 +1,10 @@
# position: 2.5 # float position is supported
-label: 'dbt Docs'
+label: 'Documentation'
collapsible: true # make the category collapsible
collapsed: true # keep the category collapsed by default
className: red
link:
type: generated-index
- title: dbt Docs FAQs
+ title: Documentation FAQs
customProps:
- description: Frequently asked questions about dbt Docs
+ description: Frequently asked questions about documentation
diff --git a/website/docs/faqs/Docs/long-descriptions.md b/website/docs/faqs/Docs/long-descriptions.md
index cdf15a94120..ef410df0517 100644
--- a/website/docs/faqs/Docs/long-descriptions.md
+++ b/website/docs/faqs/Docs/long-descriptions.md
@@ -31,4 +31,5 @@ If you need more than a sentence to explain a model, you can:
* tempor incididunt ut labore et dolore magna aliqua.
```
-3. Use a [docs block](/docs/collaborate/documentation#using-docs-blocks) to write the description in a separate Markdown file.
+3. Use a [docs block](/docs/build/documentation#using-docs-blocks) to write the description in a separate Markdown file.
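+
+   For example, a docs block might look like the following (an illustrative sketch; `orders_status` is a placeholder name):
+
+   ```markdown
+   {% docs orders_status %}
+   One of `placed`, `shipped`, `completed`, or `returned`.
+   Orders start as `placed` and move through the remaining statuses as they're fulfilled.
+   {% enddocs %}
+   ```
+
+   You can then reference it in your `.yml` file with `description: '{{ doc("orders_status") }}'`.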
diff --git a/website/docs/faqs/Docs/sharing-documentation.md b/website/docs/faqs/Docs/sharing-documentation.md
index 4c6e0e84f77..cff618586ea 100644
--- a/website/docs/faqs/Docs/sharing-documentation.md
+++ b/website/docs/faqs/Docs/sharing-documentation.md
@@ -1,8 +1,12 @@
---
-title: How do I share my documentation with my team members?
+title: How do I access documentation in dbt Explorer?
description: "Use read-only seats to share documentation"
-sidebar_label: 'Share documentation with teammates'
+sidebar_label: 'Access documentation in dbt Explorer'
id: sharing-documentation
---
-If you're using dbt Cloud to deploy your project, and have the [Team plan](https://www.getdbt.com/pricing/), you can have up to 5 read-only users, who will be able access the documentation for your project.
+If you're using dbt Cloud to deploy your project and have the [Team or Enterprise plan](https://www.getdbt.com/pricing/), you can use dbt Explorer to view your project's [resources](/docs/build/projects) (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state.
+
+Access dbt Explorer in dbt Cloud by clicking the **Explore** link in the navigation. You can have up to 5 read-only users access the documentation for your project.
+
+dbt Cloud Developer plan and dbt Core users can use [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), which generates basic documentation but doesn't offer the same speed, metadata, or visibility as dbt Explorer.
diff --git a/website/docs/guides/building-packages.md b/website/docs/guides/building-packages.md
index cc1ee2f1d74..69f963049ad 100644
--- a/website/docs/guides/building-packages.md
+++ b/website/docs/guides/building-packages.md
@@ -108,7 +108,7 @@ The major exception to this is when working with data sources that benefit from
### Test and document your package
It's critical that you [test](/docs/build/data-tests) your models and sources. This will give your end users confidence that your package is actually working on top of their dataset as intended.
-Further, adding [documentation](/docs/collaborate/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package.
+Further, adding [documentation](/docs/build/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package.
### Include useful GitHub artifacts
Over time, we've developed a set of useful GitHub artifacts that make administering our packages easier for us. In particular, we ensure that we include:
- A useful README, that has:
@@ -172,4 +172,4 @@ The release notes should contain an overview of the changes introduced in the ne
Our package registry, [hub.getdbt.com](https://hub.getdbt.com/), gets updated by the [hubcap script](https://github.com/dbt-labs/hubcap). To add your package to hub.getdbt.com, create a PR on the [hubcap repository](https://github.com/dbt-labs/hubcap) to include it in the `hub.json` file.
-
\ No newline at end of file
+
diff --git a/website/docs/guides/core-cloud-2.md b/website/docs/guides/core-cloud-2.md
index fe9c7c60141..335b164d988 100644
--- a/website/docs/guides/core-cloud-2.md
+++ b/website/docs/guides/core-cloud-2.md
@@ -143,7 +143,7 @@ Once you’ve confirmed that dbt Cloud orchestration and CI/CD are working as ex
Familiarize your team with dbt Cloud's [features](/docs/cloud/about-cloud/dbt-cloud-features) and optimize development and deployment processes. Some key features to consider include:
- **Version management:** Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, removing the hassle of manual updates and version discrepancies. You can go versionless by opting to **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest features and early access to new functionality for your dbt project.
- **Development tools**: Use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to build, test, run, and version control your dbt projects.
-- **Documentation and Source freshness:** Automate storage of [documentation](/docs/collaborate/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance.
+- **Documentation and Source freshness:** Automate storage of [documentation](/docs/build/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance.
- **Notifications and logs:** Receive immediate [notifications](/docs/deploy/monitor-jobs) for job failures, with direct links to the job details. Access comprehensive logs for all job runs to help with troubleshooting.
- **CI/CD:** Use dbt Cloud's [CI/CD](/docs/deploy/ci-jobs) feature to run your dbt projects in a temporary schema whenever new commits are pushed to open pull requests. This helps with catching bugs before deploying to production.
diff --git a/website/docs/guides/dbt-python-snowpark.md b/website/docs/guides/dbt-python-snowpark.md
index f6d54ee738f..8125f98d231 100644
--- a/website/docs/guides/dbt-python-snowpark.md
+++ b/website/docs/guides/dbt-python-snowpark.md
@@ -1858,7 +1858,7 @@ We are going to revisit 2 areas of our project to understand our documentation:
- `intermediate.md` file
- `dbt_project.yml` file
-To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/collaborate/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels.
+To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/build/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels.
1. To see all these pieces come together, execute this in the command bar:
@@ -1926,4 +1926,4 @@ Fantastic! You’ve finished the workshop! We hope you feel empowered in using b
For more help and information join our [dbt community Slack](https://www.getdbt.com/community/) which contains more than 50,000 data practitioners today. We have a dedicated slack channel #db-snowflake to Snowflake related content. Happy dbt'ing!
-
\ No newline at end of file
+
diff --git a/website/docs/guides/productionize-your-dbt-databricks-project.md b/website/docs/guides/productionize-your-dbt-databricks-project.md
index 33f25070bdb..bada787e01f 100644
--- a/website/docs/guides/productionize-your-dbt-databricks-project.md
+++ b/website/docs/guides/productionize-your-dbt-databricks-project.md
@@ -197,4 +197,4 @@ To get the most out of both tools, you can use the [persist docs config](/refere
- [Databricks + dbt Cloud Quickstart Guide](/guides/databricks)
- Reach out to your Databricks account team to get access to preview features on Databricks.
-
\ No newline at end of file
+
diff --git a/website/docs/reference/artifacts/catalog-json.md b/website/docs/reference/artifacts/catalog-json.md
index 44a3f980c60..54f0c93da90 100644
--- a/website/docs/reference/artifacts/catalog-json.md
+++ b/website/docs/reference/artifacts/catalog-json.md
@@ -7,7 +7,7 @@ sidebar_label: "Catalog"
**Produced by:** [`docs generate`](/reference/commands/cmd-docs)
-This file contains information from your about the tables and views produced and defined by the resources in your project. Today, dbt uses this file to populate metadata, such as column types and statistics, in the [docs site](/docs/collaborate/documentation).
+This file contains information from your data warehouse about the tables and views produced and defined by the resources in your project. Today, dbt uses this file to populate metadata, such as column types and statistics, in the [docs site](/docs/collaborate/build-and-view-your-docs).
### Top-level keys
diff --git a/website/docs/reference/artifacts/dbt-artifacts.md b/website/docs/reference/artifacts/dbt-artifacts.md
index 5e801d31b16..8d3e1ae29e8 100644
--- a/website/docs/reference/artifacts/dbt-artifacts.md
+++ b/website/docs/reference/artifacts/dbt-artifacts.md
@@ -5,7 +5,7 @@ sidebar_label: "About dbt artifacts"
With every invocation, dbt generates and saves one or more *artifacts*. Several of these are files (`semantic_manifest.json`, `manifest.json`, `catalog.json`, `run_results.json`, and `sources.json`) that are used to power:
-- [documentation](/docs/collaborate/documentation)
+- [documentation](/docs/collaborate/build-and-view-your-docs)
- [state](/reference/node-selection/syntax#about-node-selection)
- [visualizing source freshness](/docs/build/sources#snapshotting-source-data-freshness)
diff --git a/website/docs/reference/artifacts/manifest-json.md b/website/docs/reference/artifacts/manifest-json.md
index 5a487f2f177..296b5250d5d 100644
--- a/website/docs/reference/artifacts/manifest-json.md
+++ b/website/docs/reference/artifacts/manifest-json.md
@@ -11,7 +11,7 @@ import ManifestVersions from '/snippets/_manifest-versions.md';
This single file contains a full representation of your dbt project's resources (models, tests, macros, etc), including all node configurations and resource properties. Even if you're only running some models or tests, all resources will appear in the manifest (unless they are disabled) with most of their properties. (A few node properties, such as `compiled_sql`, only appear for executed nodes.)
-Today, dbt uses this file to populate the [docs site](/docs/collaborate/documentation), and to perform [state comparison](/reference/node-selection/syntax#about-node-selection). Members of the community have used this file to run checks on how many models have descriptions and tests.
+Today, dbt uses this file to populate the [docs site](/docs/collaborate/build-and-view-your-docs), and to perform [state comparison](/reference/node-selection/syntax#about-node-selection). Members of the community have used this file to run checks on how many models have descriptions and tests.
### Top-level keys
diff --git a/website/docs/reference/artifacts/other-artifacts.md b/website/docs/reference/artifacts/other-artifacts.md
index c4e595782fc..75a4653d685 100644
--- a/website/docs/reference/artifacts/other-artifacts.md
+++ b/website/docs/reference/artifacts/other-artifacts.md
@@ -7,7 +7,7 @@ sidebar_label: "Other artifacts"
**Produced by:** [`docs generate`](/reference/commands/cmd-docs)
-This file is the skeleton of the [auto-generated dbt documentation website](/docs/collaborate/documentation). The contents of the site are populated by the [manifest](/reference/artifacts/manifest-json) and [catalog](catalog-json).
+This file is the skeleton of the [auto-generated dbt documentation website](/docs/collaborate/build-and-view-your-docs). The contents of the site are populated by the [manifest](/reference/artifacts/manifest-json) and [catalog](catalog-json).
Note: the source code for `index.json` comes from the [dbt-docs repo](https://github.com/dbt-labs/dbt-docs). Head over there if you want to make a bug report, suggestion, or contribution relating to the documentation site.
diff --git a/website/docs/reference/commands/cmd-docs.md b/website/docs/reference/commands/cmd-docs.md
index 176bd4106cd..60b3049ccf2 100644
--- a/website/docs/reference/commands/cmd-docs.md
+++ b/website/docs/reference/commands/cmd-docs.md
@@ -42,7 +42,7 @@ dbt docs generate --no-compile
Use the `--empty-catalog` argument to skip running the database queries to populate `catalog.json`. When this flag is provided, `dbt docs generate` will skip step (3) described above.
-This is not recommended for production environments, as it means that your documentation will be missing information gleaned from database metadata (the full set of columns in each table, and statistics about those tables). It can speed up `docs generate` in development, when you just want to visualize lineage and other information defined within your project. To learn how to build your documentation in dbt Cloud, refer to [build your docs in dbt Cloud](/docs/collaborate/build-and-view-your-docs#generating-documentation).
+This is not recommended for production environments, as it means that your documentation will be missing information gleaned from database metadata (the full set of columns in each table, and statistics about those tables). It can speed up `docs generate` in development, when you just want to visualize lineage and other information defined within your project. To learn how to build your documentation in dbt Cloud, refer to [build your docs in dbt Cloud](/docs/collaborate/build-and-view-your-docs).
**Example**:
```
diff --git a/website/docs/reference/dbt-jinja-functions/doc.md b/website/docs/reference/dbt-jinja-functions/doc.md
index 51ca6ad2059..ee0b75b2e19 100644
--- a/website/docs/reference/dbt-jinja-functions/doc.md
+++ b/website/docs/reference/dbt-jinja-functions/doc.md
@@ -5,7 +5,7 @@ id: "doc"
description: "Use the `doc` to reference docs blocks in description fields."
---
-The `doc` function is used to reference docs blocks in the description field of schema.yml files. It is analogous to the `ref` function. For more information, consult the [Documentation guide](/docs/collaborate/documentation).
+The `doc` function is used to reference docs blocks in the description field of schema.yml files. It is analogous to the `ref` function. For more information, consult the [Documentation guide](/docs/collaborate/build-and-view-your-docs).
Usage:
diff --git a/website/docs/reference/project-configs/docs-paths.md b/website/docs/reference/project-configs/docs-paths.md
index 910cfbb0cce..51ff5c5ccca 100644
--- a/website/docs/reference/project-configs/docs-paths.md
+++ b/website/docs/reference/project-configs/docs-paths.md
@@ -13,7 +13,7 @@ docs-paths: [directorypath]
## Definition
-Optionally specify a custom list of directories where [docs blocks](/docs/collaborate/documentation#docs-blocks) are located.
+Optionally specify a custom list of directories where [docs blocks](/docs/build/documentation#docs-blocks) are located.
## Default
diff --git a/website/docs/reference/resource-properties/description.md b/website/docs/reference/resource-properties/description.md
index fee1d50aaf3..ce0c7c42074 100644
--- a/website/docs/reference/resource-properties/description.md
+++ b/website/docs/reference/resource-properties/description.md
@@ -157,7 +157,7 @@ A user-defined description. Can be used to document:
- analyses, and analysis columns
- macros, and macro arguments
-These descriptions are used in the documentation website rendered by dbt (refer to [the documentation guide](/docs/collaborate/documentation) or [dbt Explorer](/docs/collaborate/explore-projects)).
+These descriptions are used in the documentation website rendered by dbt (refer to [the documentation guide](/docs/build/documentation) or [dbt Explorer](/docs/collaborate/explore-projects)).
Descriptions can include markdown, as well as the [`doc` jinja function](/reference/dbt-jinja-functions/doc).
diff --git a/website/docs/terms/dag.md b/website/docs/terms/dag.md
index 0216332d953..93e2956ebb3 100644
--- a/website/docs/terms/dag.md
+++ b/website/docs/terms/dag.md
@@ -79,7 +79,7 @@ Instead of manually auditing your DAG for best practices, the [dbt project evalu
## dbt and DAGs
-The marketing team at dbt Labs would be upset with us if we told you we think dbt actually stood for “dag build tool,” but one of the key elements of dbt is its ability to generate documentation and infer relationships between models. And one of the hallmark features of [dbt Docs](https://docs.getdbt.com/docs/collaborate/documentation) is the Lineage Graph (DAG) of your dbt project.
+The marketing team at dbt Labs would be upset with us if we told you we think dbt actually stood for “dag build tool,” but one of the key elements of dbt is its ability to generate documentation and infer relationships between models. And one of the hallmark features of [dbt Docs](https://docs.getdbt.com/docs/build/documentation) is the Lineage Graph (DAG) of your dbt project.
Whether you’re using dbt Core or Cloud, dbt docs and the Lineage Graph are available to all dbt developers. The Lineage Graph in dbt Docs can show a model or source’s entire lineage, all within a visual frame. Clicking within a model, you can view the Lineage Graph and adjust selectors to only show certain models within the DAG. Analyzing the DAG here is a great way to diagnose potential inefficiencies or lack of modularity in your dbt project.
diff --git a/website/docs/terms/dry.md b/website/docs/terms/dry.md
index ec1c9229567..04b83642a08 100644
--- a/website/docs/terms/dry.md
+++ b/website/docs/terms/dry.md
@@ -26,7 +26,7 @@ WET, which stands for “Write Everything Twice,” is the opposite of DRY. It's
Well, how would you know if your code isn't DRY enough? That’s kind of subjective and will vary by the norms set within your organization. That said, a good rule of thumb is [the Rule of Three](https://en.wikipedia.org/wiki/Rule_of_three_(writing)#:~:text=The%20rule%20of%20three%20is,or%20effective%20than%20other%20numbers.). This rule states that the _third_ time you encounter a certain pattern, you should probably abstract it into some reusable unit.
-There is, of course, a tradeoff between simplicity and conciseness in code. The more abstractions you create, the harder it can be for others to understand and maintain your code without proper documentation. So, the moral of the story is: DRY code is great as long as you [write great documentation.](https://docs.getdbt.com/docs/collaborate/documentation)
+There is, of course, a tradeoff between simplicity and conciseness in code. The more abstractions you create, the harder it can be for others to understand and maintain your code without proper documentation. So, the moral of the story is: DRY code is great as long as you [write great documentation.](https://docs.getdbt.com/docs/build/documentation)
### Save time & energy
diff --git a/website/docs/terms/primary-key.md b/website/docs/terms/primary-key.md
index fde3ff44ac7..c8fc327af0d 100644
--- a/website/docs/terms/primary-key.md
+++ b/website/docs/terms/primary-key.md
@@ -151,7 +151,7 @@ When we talk about testing our primary keys, we really mean testing their unique
2. For databases that don’t offer support and enforcement of primary keys, you’re going to need to regularly test that primary keys aren’t violating their golden rule of uniqueness and non-nullness. To do this, we recommend implementing a tool like dbt that allows you to define version-controlled and code-based tests on your data models. Using these tests, you should create