diff --git a/website/blog/2021-02-05-dbt-project-checklist.md b/website/blog/2021-02-05-dbt-project-checklist.md index 9820c279b0f..efa7ca61b0e 100644 --- a/website/blog/2021-02-05-dbt-project-checklist.md +++ b/website/blog/2021-02-05-dbt-project-checklist.md @@ -173,8 +173,8 @@ This post is the checklist I created to guide our internal work, and I’m shari Useful Links -* [FAQs for documentation](/docs/collaborate/documentation#faqs) -* [Doc blocks](/docs/collaborate/documentation#using-docs-blocks) +* [FAQs for documentation](/docs/build/documentation#faqs) +* [Doc blocks](/docs/build/documentation#using-docs-blocks) ## ✅ dbt Cloud specifics ---------------------------------------------------------------------------------------------------------------------------------------------------------- diff --git a/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md b/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md index 52b2746ca14..2375d31d448 100644 --- a/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md +++ b/website/blog/2021-12-05-how-to-build-a-mature-dbt-project-from-scratch.md @@ -87,7 +87,7 @@ The most important thing we’re introducing when your project is an infant is t * Introduce modularity with [{{ ref() }}](/reference/dbt-jinja-functions/ref) and [{{ source() }}](/reference/dbt-jinja-functions/source) -* [Document](/docs/collaborate/documentation) and [test](/docs/build/data-tests) your first models +* [Document](/docs/build/documentation) and [test](/docs/build/data-tests) your first models ![image alt text](/img/blog/building-a-mature-dbt-project-from-scratch/image_3.png) diff --git a/website/blog/2022-09-28-analyst-to-ae.md b/website/blog/2022-09-28-analyst-to-ae.md index bf19bbae59e..03a466ddf80 100644 --- a/website/blog/2022-09-28-analyst-to-ae.md +++ b/website/blog/2022-09-28-analyst-to-ae.md @@ -133,7 +133,7 @@ It’s much easier to keep to a naming guide when the writer has a deep understa If we want to know how certain logic was built technically, then we can reference the SQL code in dbt docs. If we want to know *why* a certain logic was built into that specific model, then that’s where we’d turn to the documentation. -- Example of not-so-helpful documentation ([dbt docs can](https://docs.getdbt.com/docs/collaborate/documentation) build this dynamically): +- Example of not-so-helpful documentation ([dbt docs can](https://docs.getdbt.com/docs/build/documentation) build this dynamically): - `Case when Zone = 1 and Level like 'A%' then 'True' else 'False' end as GroupB` - Example of better, more descriptive documentation (add to your dbt markdown file or column descriptions): - Group B is defined as Users in Zone 1 with a Level beginning with the letter 'A'. These users are accessing our new add-on product that began in Beta in August 2022. It's recommended to filter them out of the main Active Users metric. diff --git a/website/blog/2023-02-14-passing-the-dbt-certification-exam.md b/website/blog/2023-02-14-passing-the-dbt-certification-exam.md index dbd0b856fe9..2696f3550f7 100644 --- a/website/blog/2023-02-14-passing-the-dbt-certification-exam.md +++ b/website/blog/2023-02-14-passing-the-dbt-certification-exam.md @@ -25,7 +25,7 @@ In this article, two Montreal Analytics consultants, Jade and Callie, discuss th **J:** To prepare for the exam, I built up a practice dbt project. 
All consultants do this as part of Montreal Analytics onboarding process, and this project allowed me to practice implementing sources and tests, refactoring SQL models, and debugging plenty of error messages. Additionally, I reviewed the [Certification Study Guide](https://www.getdbt.com/assets/uploads/dbt_certificate_study_guide.pdf) and attended group learning sessions. -**C:** To prepare for the exam I reviewed the official dbt Certification Study Guide and the [official dbt docs](https://docs.getdbt.com/), and attended group study and learning sessions that were hosted by Montreal Analytics for all employees interested in taking the exam. As a group, we prioritized subjects that we felt less familiar with; for the first cohort of test takers this was mainly newer topics that haven’t yet become integral to a typical dbt project, such as [doc blocks](https://docs.getdbt.com/docs/collaborate/documentation#using-docs-blocks) and [configurations versus properties](https://docs.getdbt.com/reference/configs-and-properties). These sessions mainly covered the highlights and common “gotchas” that are experienced using these techniques. The sessions were moderated by a team member who had already successfully completed the dbt Certification, but operated in a very collaborative environment, so everyone could provide additional information, ask questions to the group, and provide feedback to other members of our certification taking group. +**C:** To prepare for the exam I reviewed the official dbt Certification Study Guide and the [official dbt docs](https://docs.getdbt.com/), and attended group study and learning sessions that were hosted by Montreal Analytics for all employees interested in taking the exam. As a group, we prioritized subjects that we felt less familiar with; for the first cohort of test takers this was mainly newer topics that haven’t yet become integral to a typical dbt project, such as [doc blocks](https://docs.getdbt.com/docs/build/documentation#using-docs-blocks) and [configurations versus properties](https://docs.getdbt.com/reference/configs-and-properties). These sessions mainly covered the highlights and common “gotchas” that are experienced using these techniques. The sessions were moderated by a team member who had already successfully completed the dbt Certification, but operated in a very collaborative environment, so everyone could provide additional information, ask questions to the group, and provide feedback to other members of our certification taking group. I felt comfortable with the breadth of my dbt knowledge and had familiarity with most topics. However in my day-to-day implementation, I am often reliant on documentation or copying and pasting specific configurations in order to get the correct settings. Therefore, my focus was on memorizing important criteria for *how to use* certain features, particularly on the order/nesting of how the key YAML files are configured (dbt_project.yml, table.yml, source.yml). @@ -75,4 +75,4 @@ Now, the first thing you must do when you’ve passed a test is to get yourself Standards and best practices are very important, but a test is a measure at a single point in time of a rapidly evolving industry. It’s also a measure of my test-taking abilities, my stress levels, and other things unrelated to my skill in data modeling; I wouldn’t be a good analyst if I didn’t recognize the faults of a measurement. 
I’m glad to have this check mark completed, but I will continue to stay up to date with changes, learn new data skills and techniques, and find ways to continue being a holistically helpful teammate to my colleagues and clients. -You can learn more about the dbt Certification [here](https://www.getdbt.com/blog/dbt-certification-program/). \ No newline at end of file +You can learn more about the dbt Certification [here](https://www.getdbt.com/blog/dbt-certification-program/). diff --git a/website/blog/2023-05-04-generating-dynamic-docs.md b/website/blog/2023-05-04-generating-dynamic-docs.md index 1e704178b0a..f41302144dc 100644 --- a/website/blog/2023-05-04-generating-dynamic-docs.md +++ b/website/blog/2023-05-04-generating-dynamic-docs.md @@ -215,7 +215,7 @@ Which in turn can be copy-pasted into a new `.yml` file. In our example, we writ ## Create docs blocks for the new columns -[Docs blocks](https://docs.getdbt.com/docs/collaborate/documentation#using-docs-blocks) can be utilized to write more DRY and robust documentation. To use docs blocks, update your folder structure to contain a `.md` file. Your file structure should now look like this: +[Docs blocks](https://docs.getdbt.com/docs/build/documentation#using-docs-blocks) can be utilized to write more DRY and robust documentation. To use docs blocks, update your folder structure to contain a `.md` file. Your file structure should now look like this: ``` models/core/activity_based_interest diff --git a/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md b/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md index 4082f92b932..8e38648e43a 100644 --- a/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md +++ b/website/docs/best-practices/how-we-structure/6-the-rest-of-the-project.md @@ -50,7 +50,7 @@ When structuring your YAML configuration files in a dbt project, you want to bal - The leading underscore ensures your YAML files will be sorted to the top of every folder to make them easy to separate from your models. - YAML files don’t need unique names in the way that SQL model files do, but including the directory (instead of simply `_sources.yml` in each folder), means you can fuzzy find the right file more quickly. - We’ve recommended several different naming conventions over the years, most recently calling these `schema.yml` files. We’ve simplified to recommend that these simply be labelled based on the YAML dictionary that they contain. - - If you utilize [doc blocks](https://docs.getdbt.com/docs/collaborate/documentation#using-docs-blocks) in your project, we recommend following the same pattern, and creating a `_[directory]__docs.md` markdown file per directory containing all your doc blocks for that folder of models. + - If you utilize [doc blocks](https://docs.getdbt.com/docs/build/documentation#using-docs-blocks) in your project, we recommend following the same pattern, and creating a `_[directory]__docs.md` markdown file per directory containing all your doc blocks for that folder of models. - ❌ **Config per project.** Some people put _all_ of their source and model YAML into one file. While you can technically do this, and while it certainly simplifies knowing what file the config you’re looking for will be in (as there is only one file), it makes it much harder to find specific configurations within that file. We recommend balancing those two concerns. 
- ⚠️ **Config per model.** On the other end of the spectrum, some people prefer to create one YAML file per model. This presents less of an issue than a single monolith file, as you can quickly search for files, know exactly where specific configurations exist, spot models without configs (and thus without tests) by looking at the file tree, and various other advantages. In our opinion, the extra files, tabs, and windows this requires creating, copying from, pasting to, closing, opening, and managing creates a somewhat slower development experience that outweighs the benefits. Defining config per directory is the most balanced approach for most projects, but if you have compelling reasons to use config per model, there are definitely some great projects that follow this paradigm. - ✅ **Cascade configs.** Leverage your `dbt_project.yml` to set default configurations at the directory level. Use the well-organized folder structure we’ve created thus far to define the baseline schemas and materializations, and use dbt’s cascading scope priority to define variations to this. For example, as below, define your marts to be materialized as tables by default, define separate schemas for our separate subfolders, and any models that need to use incremental materialization can be defined at the model level. diff --git a/website/docs/docs/collaborate/documentation.md b/website/docs/docs/build/documentation.md similarity index 68% rename from website/docs/docs/collaborate/documentation.md rename to website/docs/docs/build/documentation.md index 6771f88a8d4..00ae02918b2 100644 --- a/website/docs/docs/collaborate/documentation.md +++ b/website/docs/docs/build/documentation.md @@ -1,11 +1,12 @@ --- -title: "About documentation" +title: "Documentation" description: "Learn how good documentation for your dbt models helps stakeholders discover and understand your datasets." id: "documentation" -pagination_next: "docs/collaborate/build-and-view-your-docs" -pagination_prev: null --- +Good documentation for your dbt models will help downstream consumers discover and understand the datasets you curate for them. +dbt provides a way to generate documentation for your dbt project and render it as a website. + ## Related documentation * [Declaring properties](/reference/configs-and-properties) @@ -19,18 +20,12 @@ pagination_prev: null ## Overview -Good documentation for your dbt models will help downstream consumers discover and understand the datasets which you curate for them. - -dbt provides a way to generate documentation for your dbt project and render it as a website. The documentation for your project includes: +dbt provides a way to generate documentation for your dbt project. The documentation for your project includes: * **Information about your project**: including model code, a DAG of your project, any tests you've added to a column, and more. * **Information about your **: including column data types, and sizes. This information is generated by running queries against the information schema. Importantly, dbt also provides a way to add **descriptions** to models, columns, sources, and more, to further enhance your documentation. 
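The "Adding descriptions to your project" section below tells readers to use the `description:` key "like so", but the YAML it points at sits outside the changed lines and isn't visible in this diff. A minimal sketch of what such a properties file looks like (the model and column names are illustrative, not taken from this PR):

```yaml
version: 2

models:
  - name: events  # illustrative model name
    description: One record per event captured from the product.
    columns:
      - name: event_id
        description: Primary key for the events model.
```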
-Here's an example docs site:
-
-
## Adding descriptions to your project
To add descriptions to your project, use the `description:` key in the same files where you declare [tests](/docs/build/data-tests), like so:
@@ -60,13 +55,19 @@ models:
-
## Generating project documentation
-You can generate a documentation site for your project (with or without descriptions) using the CLI.
-First, run `dbt docs generate` — this command tells dbt to compile relevant information about your dbt project and warehouse into `manifest.json` and `catalog.json` files respectively. To see the documentation for all columns and not just columns described in your project, ensure that you have created the models with `dbt run` beforehand.
+The default documentation experience in dbt Cloud is [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). Use dbt Explorer to view your project's resources (such as models, tests, and metrics), its [metadata](/docs/collaborate/explore-projects#generate-metadata), and their lineage to gain a better understanding of its latest production state.
-Then, run `dbt docs serve` to use these `.json` files to populate a local website.
+dbt Cloud developer and dbt Core users can use [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), which generates basic documentation, but it doesn't offer the same speed, metadata, or visibility as dbt Explorer.
+
+Generate documentation for your project by following these steps:
+
+1. Run `dbt docs generate` — this command tells dbt to compile relevant information about your dbt project and warehouse into `manifest.json` and `catalog.json` files, respectively.
+2. Ensure that you have created the models with `dbt run` to view the documentation for all columns, not just those described in your project.
+3. Run `dbt docs serve` if you're developing locally to use these `.json` files to populate a local website.
+
+To view a resource, its metadata, and what commands are needed in dbt Explorer, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details.
## FAQs
@@ -75,8 +76,7 @@ Then, run `dbt docs serve` to use these `.json` files to populate a local websit
-
-## Using Docs Blocks
+## Using docs blocks
### Syntax
To declare a docs block, use the jinja `docs` tag. Docs blocks can contain arbitrary markdown, but they must be uniquely named. Their names may contain uppercase and lowercase letters (A-Z, a-z), digits (0-9), and underscores (_), but can't start with a digit.
@@ -128,9 +128,11 @@ models:
In the resulting documentation, `'{{ doc("table_events") }}'` will be expanded to the markdown defined in the `table_events` docs block.
+
## Setting a custom overview
+*Currently available for dbt Docs only.*
-The "overview" shown in the documentation website can be overridden by supplying your own docs block called `__overview__`. By default, dbt supplies an overview with helpful information about the docs site itself. Depending on your needs, it may be a good idea to override this docs block with specific information about your company style guide, links to reports, or information about who to contact for help. To override the default overview, create a docs block that looks like this:
+The "overview" shown in the dbt Docs website can be overridden by supplying your own docs block called `__overview__`. By default, dbt supplies an overview with helpful information about the docs site itself.
Depending on your needs, it may be a good idea to override this docs block with specific information about your company style guide, links to reports, or information about who to contact for help. To override the default overview, create a docs block that looks like this:
@@ -148,6 +150,7 @@ as well as the repo for this project \[here](https://github.com/dbt-labs/mrr-pla
### Custom project-level overviews
+*Currently available for dbt Docs only.*
You can set different overviews for each dbt project/package included in your documentation site by creating a docs block named `__[project_name]__`. For example, in order to define
@@ -174,13 +177,21 @@ up to page views and sessions.
## Navigating the documentation site
-
-Using the docs interface, you can navigate to the documentation for a specific model. That might look something like this:
+
+Use [dbt Explorer](/docs/collaborate/explore-projects) for a richer, more interactive documentation experience and a better understanding of your project's resources and lineage. Available on [Team or Enterprise plans](https://www.getdbt.com/pricing/).
+
+For additional details on how to explore your lineage and navigate your resources, refer to [dbt Explorer](/docs/collaborate/explore-projects).
+
+
+
+If you're using the dbt Docs interface, you can navigate to the documentation for a specific model. That might look something like this:
Here, you can see a representation of the project structure, a markdown description for a model, and a list of all of the columns (with documentation) in the model.
-From a docs page, you can click the green button in the bottom-right corner of the webpage to expand a "mini-map" of your DAG. This pane (shown below) will display the immediate parents and children of the model that you're exploring.
+From the dbt Docs page, you can click the green button in the bottom-right corner of the webpage to expand a "mini-map" of your DAG. This pane (shown below) will display the immediate parents and children of the model that you're exploring.
@@ -188,17 +199,24 @@ In this example, the `fct_subscription_transactions` model only has one direct p
+
+
## Deploying the documentation site
+With dbt Cloud, [dbt Explorer](/docs/collaborate/explore-projects) automatically retrieves the metadata updates after each job run in the production or staging deployment environment, so it always has the latest results for your project.
+
:::caution Security
The `dbt docs serve` command is only intended for local/development hosting of the documentation site. Please use one of the methods listed below (or similar) to ensure that your documentation site is hosted securely!
:::
+#### For dbt Docs users
+
dbt's documentation website was built to make it easy to host on the web. The site is "static,” meaning you don't need any "dynamic" servers to serve the docs. You can host your documentation in several ways:
-* Use [dbt Cloud](/docs/collaborate/documentation)
+* Use [dbt Cloud's](/docs/collaborate/build-and-view-your-docs) default documentation experience with [dbt Explorer](/docs/collaborate/explore-projects).
* Host on [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html) (optionally [with IP access restrictions](https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html#example-bucket-policies-use-case-3)) * Publish with [Netlify](https://discourse.getdbt.com/t/publishing-dbt-docs-to-netlify/121) * Use your own web server like Apache/Nginx + diff --git a/website/docs/docs/build/exposures.md b/website/docs/docs/build/exposures.md index bcbe819d98c..0daf44b1c4c 100644 --- a/website/docs/docs/build/exposures.md +++ b/website/docs/docs/build/exposures.md @@ -6,7 +6,7 @@ id: "exposures" Exposures make it possible to define and describe a downstream use of your dbt project, such as in a dashboard, application, or data science pipeline. By defining exposures, you can then: - run, test, and list resources that feed into your exposure -- populate a dedicated page in the auto-generated [documentation](/docs/collaborate/documentation) site with context relevant to data consumers +- populate a dedicated page in the auto-generated [documentation](/docs/build/documentation) site with context relevant to data consumers ### Declaring an exposure diff --git a/website/docs/docs/build/projects.md b/website/docs/docs/build/projects.md index 45b623dc550..a65d4773ac6 100644 --- a/website/docs/docs/build/projects.md +++ b/website/docs/docs/build/projects.md @@ -16,7 +16,7 @@ At a minimum, all a project needs is the `dbt_project.yml` project configuration | [seeds](/docs/build/seeds) | CSV files with static data that you can load into your data platform with dbt. | | [data tests](/docs/build/data-tests) | SQL queries that you can write to test the models and resources in your project. | | [macros](/docs/build/jinja-macros) | Blocks of code that you can reuse multiple times. | -| [docs](/docs/collaborate/documentation) | Docs for your project that you can build. | +| [docs](/docs/build/documentation) | Docs for your project that you can build. | | [sources](/docs/build/sources) | A way to name and describe the data loaded into your warehouse by your Extract and Load tools. | | [exposures](/docs/build/exposures) | A way to define and describe a downstream use of your project. | | [metrics](/docs/build/build-metrics-intro) | A way for you to define metrics for your project. | diff --git a/website/docs/docs/build/sources.md b/website/docs/docs/build/sources.md index 466bcedc688..93757cdfa71 100644 --- a/website/docs/docs/build/sources.md +++ b/website/docs/docs/build/sources.md @@ -91,7 +91,7 @@ You can also: - Add data tests to sources - Add descriptions to sources, that get rendered as part of your documentation site -These should be familiar concepts if you've already added tests and descriptions to your models (if not check out the guides on [testing](/docs/build/data-tests) and [documentation](/docs/collaborate/documentation)). +These should be familiar concepts if you've already added tests and descriptions to your models (if not check out the guides on [testing](/docs/build/data-tests) and [documentation](/docs/build/documentation)). 
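The sources hunk above lists adding data tests and descriptions to sources, but the page's own YAML isn't part of the changed lines. A minimal illustrative sketch, with a source, table, and column that are assumptions rather than anything from this PR:

```yaml
version: 2

sources:
  - name: jaffle_shop  # illustrative source name
    description: Raw data replicated from the application database.
    tables:
      - name: orders
        description: One record per order placed.
        columns:
          - name: id
            description: Primary key of the orders table.
            data_tests:  # use `tests:` on dbt versions earlier than 1.8
              - unique
              - not_null
```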
diff --git a/website/docs/docs/build/sql-models.md b/website/docs/docs/build/sql-models.md index 87e063cdcdb..a019508d370 100644 --- a/website/docs/docs/build/sql-models.md +++ b/website/docs/docs/build/sql-models.md @@ -260,7 +260,7 @@ Additionally, the `ref` function encourages you to write modular transformations ## Testing and documenting models -You can also document and test models — skip ahead to the section on [testing](/docs/build/data-tests) and [documentation](/docs/collaborate/documentation) for more information. +You can also document and test models — skip ahead to the section on [testing](/docs/build/data-tests) and [documentation](/docs/build/documentation) for more information. ## Additional FAQs diff --git a/website/docs/docs/cloud/dbt-assist.md b/website/docs/docs/cloud/dbt-assist.md index cac5457812a..eafe7d05821 100644 --- a/website/docs/docs/cloud/dbt-assist.md +++ b/website/docs/docs/cloud/dbt-assist.md @@ -8,7 +8,7 @@ pagination_prev: null # About dbt Assist -dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works. dbt Assist’s AI co-pilot generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. +dbt Assist is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud, allowing you to focus on delivering data that works. dbt Assist’s AI co-pilot generates [documentation](/docs/build/documentation) and [tests](/docs/build/data-tests) for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. :::tip Beta feature dbt Assist is an AI tool meant to _help_ developers generate documentation and tests in dbt Cloud. It's available in beta, in the dbt Cloud IDE only. diff --git a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md index 1e561b379b4..e2fb122cba3 100644 --- a/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md +++ b/website/docs/docs/cloud/dbt-cloud-ide/develop-in-the-cloud.md @@ -131,7 +131,7 @@ Nice job, you're ready to start developing and building models 🎉! - **Generate your YAML configurations with dbt Assist** — [dbt Assist](/docs/cloud/dbt-assist) is a powerful artificial intelligence (AI) co-pilot feature that helps automate development in dbt Cloud. It generates documentation and tests for your dbt SQL models directly in the dbt Cloud IDE, with a click of a button, and helps you accomplish more in less time. Available for dbt Cloud Enterprise plans. -- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs#generating-documentation) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production. +- **Build and view your project's docs** — The dbt Cloud IDE makes it possible to [build and view](/docs/collaborate/build-and-view-your-docs) documentation for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production. 
## Related docs diff --git a/website/docs/docs/collaborate/build-and-view-your-docs.md b/website/docs/docs/collaborate/build-and-view-your-docs.md new file mode 100644 index 00000000000..ad43795a38c --- /dev/null +++ b/website/docs/docs/collaborate/build-and-view-your-docs.md @@ -0,0 +1,85 @@ +--- +title: "Build and view your docs with dbt Cloud" +id: "build-and-view-your-docs" +description: "Automatically generate project documentation as you run jobs." +pagination_next: null +--- + +dbt Cloud enables you to generate documentation for your project and data platform. The documentation is automatically updated with new information after a fully successful job run, ensuring accuracy and relevance. + +The default documentation experience in dbt Cloud is [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). Use [dbt Explorer](/docs/collaborate/explore-projects) to view your project's resources (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state. + +Refer to [documentation](/docs/build/documentation) for more configuration details. + +This shift makes [dbt Docs](#dbt-docs) a legacy documentation feature in dbt Cloud. dbt Docs is still accessible and offers basic documentation, but it doesn't offer the same speed, metadata, or visibility as dbt Explorer. dbt Docs is available to dbt Cloud developer plans or dbt Core users. + +## Set up a documentation job + +dbt Explorer uses the [metadata](/docs/collaborate/explore-projects#generate-metadata) generated after each job run in the production or staging environment, ensuring it always has the latest project results. To view richer metadata, you can set up documentation for a job in dbt Cloud when you edit your job settings or create a new job. + +Configure the job to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) when it runs. If you want to view column and statistics for models, sources, and snapshots in dbt Explorer, then this step is necessary. + +To set up a job to generate docs: + +1. In the top left, click **Deploy** and select **Jobs**. +2. Create a new job or select an existing job and click **Settings**. +3. Under **Execution Settings**, select **Generate docs on run** and click **Save**. + + +*Note, for dbt Docs users you need to configure the job to generate docs when it runs, then manually link that job to your project. Proceed to [configure project documentation](#configure-project-documentation) so your project generates the documentation when this job runs.* + +You can also add the [`dbt docs generate` command](/reference/commands/cmd-docs) to the list of commands in the job run steps. However, you can expect different outcomes when adding the command to the run steps compared to configuring a job selecting the **Generate docs on run** checkbox. + +Review the following options and outcomes: + +| Options | Outcomes | +|--------| ------- | +| **Select checkbox** | Select the **Generate docs on run** checkbox to automatically generate updated project docs each time your job runs. If that particular step in your job fails, the job can still be successful if all subsequent steps are successful. | +| **Add as a run step** | Add `dbt docs generate` to the list of commands in the job run steps, in whatever order you prefer. If that particular step in your job fails, the job will fail and all subsequent steps will be skipped. 
| + +:::tip Tip — Documentation-only jobs + +To create and schedule documentation-only jobs at the end of your production jobs, add the `dbt compile` command in the **Commands** section. + +::: + +## dbt Docs + +dbt Docs, available on developer plans or dbt Core users, generates a website from your dbt project using the `dbt docs generate` command. It provides a central location to view your project's resources, such as models, tests, and lineage — and helps you understand the data in your warehouse. + +### Configure project documentation + +You configure project documentation to generate documentation when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation. + +1. Click the gear icon in the top right. +2. Select **Account Settings**. +3. Navigate to **Projects** and select the project that needs documentation. +4. Click **Edit**. +5. Under **Artifacts**, select the job that should generate docs when it runs and click **Save**. + + +:::tip Use dbt Explorer for a richer documentation experience +For a richer and more interactive experience, try out [dbt Explorer](/docs/collaborate/explore-projects), available on [Team or Enterprise plans](https://www.getdbt.com/pricing/). It includes map layers of your DAG, keyword search, interacts with the IDE, model performance, project recommendations, and more. +::: + +### Generating documentation + +To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the **Command Bar** in the dbt Cloud IDE. This command will generate the documentation for your dbt project as it exists in development in your IDE session. + +After generating your documentation, you can click **Explore** in the navigation. This will take you to dbt Explorer, where you can view your project's resources and their lineage. + + + +After running `dbt docs generate` in the dbt Cloud IDE, click the icon above the file tree, to see the latest version of your documentation rendered in a new browser window. + +### View documentation + +Once you set up a job to generate documentation for your project, you can click **Explore** in the navigation and then click on **dbt Docs**. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud. + +These generated docs always show the last fully successful run, which means that if you have any failed tasks, including tests, then you will not see changes to the docs by this run. If you don't see a fully successful run, then you won't see any changes to the documentation. + +The dbt Cloud IDE makes it possible to view [documentation](/docs/build/documentation) for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production. 
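To ground the steps above, here is a rough sketch of the commands involved; the `dbt build` step and the port are placeholders, not settings from this PR:

```shell
# Typical run steps for a job that also produces documentation (illustrative)
dbt build            # your usual run steps
dbt docs generate    # writes the manifest.json and catalog.json artifacts

# dbt Core / local development only: serve the generated site
dbt docs serve --port 8080
```

In dbt Cloud, selecting **Generate docs on run** adds the equivalent of the `dbt docs generate` step for you after your listed commands.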
+ +## Related docs +- [Documentation](/docs/build/documentation) +- [dbt Explorer](/docs/collaborate/explore-projects) diff --git a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md deleted file mode 100644 index 0129b43f305..00000000000 --- a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md +++ /dev/null @@ -1,68 +0,0 @@ ---- -title: "Build and view your docs with dbt Cloud" -id: "build-and-view-your-docs" -description: "Automatically generate project documentation as you run jobs." -pagination_next: null ---- - -dbt Cloud enables you to generate documentation for your project and data platform, rendering it as a website. The documentation is only updated with new information after a fully successful job run, ensuring accuracy and relevance. Refer to [Documentation](/docs/collaborate/documentation) for more details. - -## Set up a documentation job - -You can set up documentation for a job in dbt Cloud when you edit your job settings or create a new job. You need to configure the job to generate docs when it runs, then link that job to your project. - -To set up a job to generate docs: - -1. In the top left, click **Deploy** and select **Jobs**. -2. Create a new job or select an existing job and click **Settings**. -3. Under "Execution Settings," select **Generate docs on run**. - - -4. Click **Save**. Proceed to [configure project documentation](#configure-project-documentation) so your project generates the documentation when this job runs. - -You can also add `dbt docs generate` to the list of commands in the job run steps. However, you can expect different outcomes when adding the command to the run steps compared to configuring a job selecting the **Generate docs on run** checkbox (shown in previous steps). - -Review the following options and outcomes: - -| Options | Outcomes | -|--------| ------- | -| **Select checkbox** | Select the **Generate docs on run** checkbox to automatically generate updated project docs each time your job runs. If that particular step in your job fails, the job can still be successful if all subsequent steps are successful. | -| **Add as a run step** | Add `dbt docs generate` to the list of commands in the job run steps, in whatever order you prefer. If that particular step in your job fails, the job will fail and all subsequent steps will be skipped. | - -:::tip Tip — Documentation-only jobs - -To create and schedule documentation-only jobs at the end of your production jobs, add the `dbt compile` command in the **Commands** section. - -::: - -## Configure project documentation - -You configure project documentation to generate documentation when the job you set up in the previous section runs. In the project settings, specify the job that generates documentation artifacts for that project. Once you configure this setting, subsequent runs of the job will automatically include a step to generate documentation. - -1. Click the gear icon in the top right. -2. Select **Account Settings**. -3. Navigate to **Projects** and select the project that needs documentation. -4. Click **Edit**. -5. Under **Artifacts**, select the job that should generate docs when it runs. - -6. Click **Save**. - -## Generating documentation - -To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the -Command Bar in the dbt Cloud IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session. 
- - - -After generating your documentation, you can click the **Book** icon above the file tree, to see the latest version of your documentation rendered in a new browser window. - -## Viewing documentation - -Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link will always help you find the most recent version of your project's documentation in dbt Cloud. - -These generated docs always show the last fully successful run, which means that if you have any failed tasks, including tests, then you will not see changes to the docs by this run. If you don't see a fully successful run, then you won't see any changes to the documentation. - -The dbt Cloud IDE makes it possible to view [documentation](/docs/collaborate/documentation) -for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production. - - diff --git a/website/docs/docs/collaborate/collaborate-with-others.md b/website/docs/docs/collaborate/collaborate-with-others.md index 7875a8044b6..c8c8bd4657f 100644 --- a/website/docs/docs/collaborate/collaborate-with-others.md +++ b/website/docs/docs/collaborate/collaborate-with-others.md @@ -8,7 +8,7 @@ pagination_prev: null
@@ -26,7 +26,7 @@ pagination_prev: null -
\ No newline at end of file + diff --git a/website/docs/docs/collaborate/explore-projects.md b/website/docs/docs/collaborate/explore-projects.md index a92d5a69ad1..aa549520f34 100644 --- a/website/docs/docs/collaborate/explore-projects.md +++ b/website/docs/docs/collaborate/explore-projects.md @@ -1,8 +1,8 @@ --- -title: "Explore your dbt projects" -sidebar_label: "Explore dbt projects" -description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your data pipelines." -pagination_next: "docs/collaborate/model-performance" +title: "Discover data with dbt Explorer" +sidebar_label: "Discover data with dbt Explorer" +description: "Learn about dbt Explorer and how to interact with it to understand, improve, and leverage your dbt projects." +pagination_next: "docs/collaborate/column-level-lineage" pagination_prev: null --- @@ -12,28 +12,30 @@ With dbt Explorer, you can view your project's [resources](/docs/build/projects) - You have a dbt Cloud account on the [Team or Enterprise plan](https://www.getdbt.com/pricing/). - You have set up a [production](/docs/deploy/deploy-environments#set-as-production-environment) or [staging](/docs/deploy/deploy-environments#create-a-staging-environment) deployment environment for each project you want to explore. - - There has been at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer. -- You are on the dbt Explorer page. To do this, select **Explore** from the top navigation bar in dbt Cloud. +- You have at least one successful job run in the deployment environment. Note that [CI jobs](/docs/deploy/ci-jobs) do not update dbt Explorer. +- You are on the dbt Explorer page. To do this, select **Explore** from the navigation in dbt Cloud. + -## Generate metadata +## Generate metadata -dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ or _staging_ in your dbt Cloud project. dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project. +dbt Explorer uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). The metadata that's available depends on the [deployment environment](/docs/deploy/deploy-environments) you've designated as _production_ or _staging_ in your dbt Cloud project. -Note that CI jobs do not update dbt Explorer. This is because they don't reflect the production state and don't provide the necessary metadata updates. - -To view a resource and its metadata, you must define the resource in your project and run a job in the production or staging environment. The resulting metadata depends on the [commands](/docs/deploy/job-commands) executed by the jobs. +- dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project. This includes deploy and merge jobs. +- Note that CI jobs do not update dbt Explorer. 
This is because they don't reflect the production state and don't provide the necessary metadata updates. +- To view a resource and its metadata, you must define the resource in your project and run a job in the production or staging environment. +- The resulting metadata depends on the [commands](/docs/deploy/job-commands) executed by the jobs. | To view in Explorer | You must successfully run | |---------------------|---------------------------| | Model lineage, details, or results | [dbt run](/reference/commands/run) or [dbt build](/reference/commands/build) on a given model within a job in the environment | -| Columns and statistics for models, sources, and snapshots| [dbt docs generate](/reference/commands/cmd-docs) within a job in the environment | +| Columns and statistics for models, sources, and snapshots| [dbt docs generate](/reference/commands/cmd-docs) within [a job](/docs/collaborate/build-and-view-your-docs) in the environment | | Test results | [dbt test](/reference/commands/test) or [dbt build](/reference/commands/build) within a job in the environment | | Source freshness results | [dbt source freshness](/reference/commands/source#dbt-source-freshness) within a job in the environment | | Snapshot details | [dbt snapshot](/reference/commands/snapshot) or [dbt build](/reference/commands/build) within a job in the environment | | Seed details | [dbt seed](/reference/commands/seed) or [dbt build](/reference/commands/build) within a job in the environment | -Richer and more timely metadata will become available as dbt Cloud evolves. +Richer and more timely metadata will become available as dbt Cloud evolves. ## Explore your project's lineage graph {#project-lineage} @@ -184,7 +186,7 @@ In the upper right corner of the resource details page, you can: - **Status bar** (below the page title) — Information on the last time the model ran, whether the run was successful, how the data is materialized, number of rows, and the size of the model. - **General** tab includes: - **Lineage** graph — The model’s lineage graph that you can interact with. The graph includes one upstream node and one downstream node from the model. Click the Expand icon in the graph's upper right corner to view the model in full lineage graph mode. - - **Description** section — A [description of the model](/docs/collaborate/documentation#adding-descriptions-to-your-project). + - **Description** section — A [description of the model](/docs/build/documentation#adding-descriptions-to-your-project). - **Recent** section — Information on the last time the model ran, how long it ran for, whether the run was successful, the job ID, and the run ID. - **Tests** section — [Tests](/docs/build/data-tests) for the model, including a status indicator for the latest test status. A :white_check_mark: denotes a passing test. - **Details** section — Key properties like the model’s relation name (for example, how it’s represented and how you can query it in the data platform: `database.schema.identifier`); model governance attributes like access, group, and if contracted; and more. 
diff --git a/website/docs/docs/dbt-cloud-apis/discovery-api.md b/website/docs/docs/dbt-cloud-apis/discovery-api.md index 438cf431060..0345c647dd9 100644 --- a/website/docs/docs/dbt-cloud-apis/discovery-api.md +++ b/website/docs/docs/dbt-cloud-apis/discovery-api.md @@ -50,7 +50,8 @@ Use the API to find and understand dbt assets in integrated tools using informat Data producers must manage and organize data for stakeholders, while data consumers need to quickly and confidently analyze data on a large scale to make informed decisions that improve business outcomes and reduce organizational overhead. The API is useful for discovery data experiences in catalogs, analytics, apps, and machine learning (ML) tools. It can help you understand the origin and meaning of datasets for your analysis. - + + diff --git a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md index f14fd03a534..38cc7c69b6a 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/11-Older versions/upgrading-to-0-18-0.md @@ -69,7 +69,7 @@ can override schema test definitions - [`full_refresh` config](/reference/resource-configs/full_refresh) **Docs** -- [project-level overviews](/docs/collaborate/documentation#custom-project-level-overviews) +- [project-level overviews](/docs/build/documentation#custom-project-level-overviews) **Redshift** - [`iam_profile`](/docs/core/connect-data-platform/redshift-setup#specifying-an-iam-profile) diff --git a/website/docs/docs/deploy/artifacts.md b/website/docs/docs/deploy/artifacts.md index 9b3ae71e79c..cff36bfafba 100644 --- a/website/docs/docs/deploy/artifacts.md +++ b/website/docs/docs/deploy/artifacts.md @@ -4,13 +4,23 @@ id: "artifacts" description: "Use artifacts to power your automated docs site and source freshness data." --- -When running dbt jobs, dbt Cloud generates and saves *artifacts*. You can use these artifacts, like `manifest.json`, `catalog.json`, and `sources.json` to power different aspects of dbt Cloud, namely: [dbt Docs](/docs/collaborate/documentation) and [source freshness reporting](/docs/build/sources#snapshotting-source-data-freshness). +When running dbt jobs, dbt Cloud generates and saves *artifacts*. You can use these artifacts, like `manifest.json`, `catalog.json`, and `sources.json` to power different aspects of dbt Cloud, namely: [dbt Explorer](/docs/collaborate/explore-projects), [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), and [source freshness reporting](/docs/build/sources#snapshotting-source-data-freshness). ## Create dbt Cloud Artifacts -While running any job can produce artifacts, you should only associate one production job with a given project to produce the project's artifacts. You can designate this connection in the **Project details** page. To access this page, click the gear icon in the upper right, select **Account Settings**, select your project, and click **Edit** in the lower right. Under **Artifacts**, select the jobs you want to produce documentation and source freshness artifacts for. +[dbt Explorer](/docs/collaborate/explore-projects#generate-metadata) uses the metadata provided by the [Discovery API](/docs/dbt-cloud-apis/discovery-api) to display the details about [the state of your project](/docs/dbt-cloud-apis/project-state). 
It uses metadata from your staging and production [deployment environments](/docs/deploy/deploy-environments) (development environment metadata is coming soon). - +dbt Explorer automatically retrieves the metadata updates after each job run in the production or staging deployment environment so it always has the latest results for your project — meaning it's always automatically updated after each job run. + +To view a resource, its metadata, and what commands are needed, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details. + + + +The following steps are for legacy dbt Docs only. For the current documentation experience, see [dbt Explorer](/docs/collaborate/explore-projects). + +While running any job can produce artifacts, you should only associate one production job with a given project to produce the project's artifacts. You can designate this connection on the **Project details** page. To access this page, click the gear icon in the upper right, select **Account Settings**, select your project, and click **Edit** in the lower right. Under **Artifacts**, select the jobs you want to produce documentation and source freshness artifacts for. + + If you don't see your job listed, you might need to edit the job and select **Run source freshness** and **Generate docs on run**. @@ -18,17 +28,30 @@ If you don't see your job listed, you might need to edit the job and select **Ru When you add a production job to a project, dbt Cloud updates the content and provides links to the production documentation and source freshness artifacts it generated for that project. You can see these links by clicking **Deploy** in the upper left, selecting **Jobs**, and then selecting the production job. From the job page, you can select a specific run to see how artifacts were updated for that run only. + + ### Documentation -When set up, dbt Cloud updates the **Documentation** link in the header tab so it links to documentation for this job. This link always directs you to the latest version of the documentation for your project. +Navigate to [dbt Explorer](/docs/collaborate/explore-projects) through the **Explore** link to view your project's resources and lineage to gain a better understanding of its latest production state. -Note that both the job's commands and the docs generate step (triggered by the **Generate docs on run** checkbox) must succeed during the job invocation for the project-level documentation to be populated or updated. +To view a resource, its metadata, and what commands are needed, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more details. +Both the job's commands and the docs generate step (triggered by the **Generate docs on run** checkbox) must succeed during the job invocation to update the documentation. - + + +When set up, dbt Cloud updates the Documentation link in the header tab so it links to documentation for this job. This link always directs you to the latest version of the documentation for your project. + + ### Source Freshness -As with Documentation, configuring a job for the Source Freshness artifact setting also updates the Data Sources link under **Deploy**. The new link points to the latest Source Freshness report for the selected job. +To view the latest source freshness result, refer to [generate metadata](/docs/collaborate/explore-projects#generate-metadata) for more detail. Then navigate to dbt Explorer through the **Explore** link. 
+ + + +Configuring a job for the Source Freshness artifact setting also updates the data source link under **Deploy**. The new link points to the latest Source Freshness report for the selected job. + + diff --git a/website/docs/docs/deploy/job-commands.md b/website/docs/docs/deploy/job-commands.md index 26fe1931db6..8117178b2d6 100644 --- a/website/docs/docs/deploy/job-commands.md +++ b/website/docs/docs/deploy/job-commands.md @@ -35,7 +35,7 @@ Every job invocation automatically includes the [`dbt deps`](/reference/commands For every job, you have the option to select the [Generate docs on run](/docs/collaborate/build-and-view-your-docs) or [Run source freshness](/docs/deploy/source-freshness) checkboxes, enabling you to run the commands automatically. -**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Build and view your docs](/docs/collaborate/build-and-view-your-docs) for more info. +**Job outcome Generate docs on run checkbox** — dbt Cloud executes the `dbt docs generate` command, _after_ the listed commands. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Set up documentation job](/docs/collaborate/build-and-view-your-docs) for more info. **Job outcome Source freshness checkbox** — dbt Cloud executes the `dbt source freshness` command as the first run step in your job. If that particular run step in your job fails, the job can still succeed if all subsequent run steps are successful. Read [Source freshness](/docs/deploy/source-freshness) for more info. diff --git a/website/docs/docs/deploy/source-freshness.md b/website/docs/docs/deploy/source-freshness.md index ab267b6d067..a409c01f82c 100644 --- a/website/docs/docs/deploy/source-freshness.md +++ b/website/docs/docs/deploy/source-freshness.md @@ -12,7 +12,7 @@ dbt Cloud provides a helpful interface around dbt's [source data freshness](/doc [`dbt build`](reference/commands/build) does _not_ include source freshness checks when building and testing resources in your DAG. Instead, you can use one of these common patterns for defining jobs: - Add `dbt build` to the run step to run models, tests, and so on. -- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs#set-up-a-documentation-job). +- Select the **Generate docs on run** checkbox to automatically [generate project docs](/docs/collaborate/build-and-view-your-docs). - Select the **Run source freshness** checkbox to enable [source freshness](#checkbox) as the first step of the job. @@ -42,4 +42,4 @@ It's important that your freshness jobs run frequently enough to snapshot data l ## Further reading - Refer to [Artifacts](/docs/deploy/artifacts) for more info on how to create dbt Cloud artifacts, share links to the latest documentation, and share source freshness reports with your team. -- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation). \ No newline at end of file +- Source freshness for Snowflake is calculated using the `LAST_ALTERED` column. Read about the limitations in [Snowflake configs](/reference/resource-configs/snowflake-configs#source-freshness-known-limitation). 
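The source freshness hunk above assumes freshness is already configured on your sources, which happens in YAML that isn't part of this diff. A minimal sketch for context (the source name, `loaded_at_field`, and thresholds are illustrative assumptions):

```yaml
version: 2

sources:
  - name: raw_payments  # illustrative source name
    loaded_at_field: _etl_loaded_at
    freshness:
      warn_after: {count: 12, period: hour}
      error_after: {count: 24, period: hour}
    tables:
      - name: payments
```

With something like this in place, the **Run source freshness** step (`dbt source freshness`) compares the most recent `_etl_loaded_at` value per table against the `warn_after` and `error_after` thresholds.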
diff --git a/website/docs/docs/introduction.md b/website/docs/docs/introduction.md index 980915a2c42..5301dae396d 100644 --- a/website/docs/docs/introduction.md +++ b/website/docs/docs/introduction.md @@ -61,7 +61,7 @@ As a dbt user, your main focus will be on writing models (select queries) that r | Handle boilerplate code to materialize queries as relations | For each model you create, you can easily configure a *materialization*. A materialization represents a build strategy for your select query – the code behind a materialization is robust, boilerplate SQL that wraps your select query in a statement to create a new, or update an existing, relation. Read more about [Materializations](/docs/build/materializations).| | Use a code compiler | SQL files can contain Jinja, a lightweight templating language. Using Jinja in SQL provides a way to use control structures in your queries. For example, `if` statements and `for` loops. It also enables repeated SQL to be shared through `macros`. Read more about [Macros](/docs/build/jinja-macros).| | Determine the order of model execution | Often, when transforming data, it makes sense to do so in a staged approach. dbt provides a mechanism to implement transformations in stages through the [ref function](/reference/dbt-jinja-functions/ref). Rather than selecting from existing tables and views in your warehouse, you can select from another model.| -| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/collaborate/documentation).| +| Document your dbt project | In dbt Cloud, you can auto-generate the documentation when your dbt project runs. dbt provides a mechanism to write, version-control, and share documentation for your dbt models. You can write descriptions (in plain text or markdown) for each model and field. Read more about the [Documentation](/docs/build/documentation).| | Test your models | Tests provide a way to improve the integrity of the SQL in each model by making assertions about the results generated by a model. Build, test, and run your project with a button click or by using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) command bar. Read more about writing tests for your models [Testing](/docs/build/data-tests)| | Manage packages | dbt ships with a package manager, which allows analysts to use and publish both public and private repositories of dbt code which can then be referenced by others. Read more about [Package Management](/docs/build/packages). | | Load seed files| Often in analytics, raw values need to be mapped to a more readable value (for example, converting a country-code to a country name) or enriched with static or infrequently changing data. These data sources, known as seed files, can be saved as a CSV file in your `project` and loaded into your data warehouse using the `seed` command. 
Read more about [Seeds](/docs/build/seeds).| diff --git a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md index f1e631f0d78..9e254de92d8 100644 --- a/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md +++ b/website/docs/docs/running-a-dbt-project/run-your-dbt-projects.md @@ -8,7 +8,7 @@ You can run your dbt projects with [dbt Cloud](/docs/cloud/about-cloud/dbt-cloud - **dbt Cloud**: A hosted application where you can develop directly from a web browser using the [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud). It also natively supports developing using a command line interface, [dbt Cloud CLI](/docs/cloud/cloud-cli-installation). Among other features, dbt Cloud provides: - Development environment to help you build, test, run, and [version control](/docs/collaborate/git-version-control) your project faster. - - Share your [dbt project's documentation](/docs/collaborate/build-and-view-your-docs) with your team. + - Share your [dbt project's documentation](/docs/build/documentation) with your team. - Integrates with the dbt Cloud IDE, allowing you to run development tasks and environment in the dbt Cloud UI for a seamless experience. - The dbt Cloud CLI to develop and run dbt commands against your dbt Cloud development environment from your local command line. - For more details, refer to [Develop dbt](/docs/cloud/about-develop-dbt). diff --git a/website/docs/faqs/Docs/_category_.yaml b/website/docs/faqs/Docs/_category_.yaml index 8c7925dcc15..0a9aa44fe56 100644 --- a/website/docs/faqs/Docs/_category_.yaml +++ b/website/docs/faqs/Docs/_category_.yaml @@ -1,10 +1,10 @@ # position: 2.5 # float position is supported -label: 'dbt Docs' +label: 'Documentation' collapsible: true # make the category collapsible collapsed: true # keep the category collapsed by default className: red link: type: generated-index - title: dbt Docs FAQs + title: Documentation FAQs customProps: - description: Frequently asked questions about dbt Docs + description: Frequently asked questions about documentation diff --git a/website/docs/faqs/Docs/long-descriptions.md b/website/docs/faqs/Docs/long-descriptions.md index cdf15a94120..ef410df0517 100644 --- a/website/docs/faqs/Docs/long-descriptions.md +++ b/website/docs/faqs/Docs/long-descriptions.md @@ -31,4 +31,4 @@ If you need more than a sentence to explain a model, you can: * tempor incididunt ut labore et dolore magna aliqua. ``` -3. Use a [docs block](/docs/collaborate/documentation#using-docs-blocks) to write the description in a separate Markdown file. +3. Use a [docs block](/docs/build/documentation#using-docs-blocks) to write the description in a separate Markdown file. diff --git a/website/docs/faqs/Docs/sharing-documentation.md b/website/docs/faqs/Docs/sharing-documentation.md index 4c6e0e84f77..cff618586ea 100644 --- a/website/docs/faqs/Docs/sharing-documentation.md +++ b/website/docs/faqs/Docs/sharing-documentation.md @@ -1,8 +1,12 @@ --- -title: How do I share my documentation with my team members?
description: "Use read-only seats to share documentation" -sidebar_label: 'Share documentation with teammates' +sidebar_label: 'Access documentation in dbt Explorer' id: sharing-documentation --- -If you're using dbt Cloud to deploy your project, and have the [Team plan](https://www.getdbt.com/pricing/), you can have up to 5 read-only users, who will be able access the documentation for your project. +If you're using dbt Cloud to deploy your project and have the [Team or Enterprise plan](https://www.getdbt.com/pricing/), you can use dbt Explorer to view your project's [resources](/docs/build/projects) (such as models, tests, and metrics) and their lineage to gain a better understanding of its latest production state. + +Access dbt Explorer in dbt Cloud by clicking the **Explore** link in the navigation. You can have up to 5 read-only users access the documentation for your project. + +dbt Cloud developer plan and dbt Core users can use [dbt Docs](/docs/collaborate/build-and-view-your-docs#dbt-docs), which generates basic documentation but it doesn't offer the same speed, metadata, or visibility as dbt Explorer. diff --git a/website/docs/guides/building-packages.md b/website/docs/guides/building-packages.md index cc1ee2f1d74..69f963049ad 100644 --- a/website/docs/guides/building-packages.md +++ b/website/docs/guides/building-packages.md @@ -108,7 +108,7 @@ The major exception to this is when working with data sources that benefit from ### Test and document your package It's critical that you [test](/docs/build/data-tests) your models and sources. This will give your end users confidence that your package is actually working on top of their dataset as intended. -Further, adding [documentation](/docs/collaborate/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package. +Further, adding [documentation](/docs/build/documentation) via descriptions will help communicate your package to end users, and benefit their stakeholders that use the outputs of this package. ### Include useful GitHub artifacts Over time, we've developed a set of useful GitHub artifacts that make administering our packages easier for us. In particular, we ensure that we include: - A useful README, that has: @@ -172,4 +172,4 @@ The release notes should contain an overview of the changes introduced in the ne Our package registry, [hub.getdbt.com](https://hub.getdbt.com/), gets updated by the [hubcap script](https://github.com/dbt-labs/hubcap). To add your package to hub.getdbt.com, create a PR on the [hubcap repository](https://github.com/dbt-labs/hubcap) to include it in the `hub.json` file. - \ No newline at end of file + diff --git a/website/docs/guides/core-cloud-2.md b/website/docs/guides/core-cloud-2.md index fe9c7c60141..335b164d988 100644 --- a/website/docs/guides/core-cloud-2.md +++ b/website/docs/guides/core-cloud-2.md @@ -143,7 +143,7 @@ Once you’ve confirmed that dbt Cloud orchestration and CI/CD are working as ex Familiarize your team with dbt Cloud's [features](/docs/cloud/about-cloud/dbt-cloud-features) and optimize development and deployment processes. Some key features to consider include: - **Version management:** Manage [dbt versions](/docs/dbt-versions/upgrade-dbt-version-in-cloud) and ensure team collaboration with dbt Cloud's one-click feature, removing the hassle of manual updates and version discrepancies. 
You can go versionless by opting to **[Keep on latest version](/docs/dbt-versions/upgrade-dbt-version-in-cloud#keep-on-latest-version)** to always get the latest features and early access to new functionality for your dbt project. - **Development tools**: Use the [dbt Cloud CLI](/docs/cloud/cloud-cli-installation) or [dbt Cloud IDE](/docs/cloud/dbt-cloud-ide/develop-in-the-cloud) to build, test, run, and version control your dbt projects. -- **Documentation and Source freshness:** Automate storage of [documentation](/docs/collaborate/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance. +- **Documentation and Source freshness:** Automate storage of [documentation](/docs/build/documentation) and track [source freshness](/docs/deploy/source-freshness) in dbt Cloud, which streamlines project maintenance. - **Notifications and logs:** Receive immediate [notifications](/docs/deploy/monitor-jobs) for job failures, with direct links to the job details. Access comprehensive logs for all job runs to help with troubleshooting. - **CI/CD:** Use dbt Cloud's [CI/CD](/docs/deploy/ci-jobs) feature to run your dbt projects in a temporary schema whenever new commits are pushed to open pull requests. This helps with catching bugs before deploying to production. diff --git a/website/docs/guides/dbt-python-snowpark.md b/website/docs/guides/dbt-python-snowpark.md index f6d54ee738f..8125f98d231 100644 --- a/website/docs/guides/dbt-python-snowpark.md +++ b/website/docs/guides/dbt-python-snowpark.md @@ -1858,7 +1858,7 @@ We are going to revisit 2 areas of our project to understand our documentation: - `intermediate.md` file - `dbt_project.yml` file -To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/collaborate/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels. +To start, let’s look back at our `intermediate.md` file. We can see that we provided multi-line descriptions for the models in our intermediate models using [docs blocks](/docs/build/documentation#using-docs-blocks). Then we reference these docs blocks in our `.yml` file. Building descriptions with doc blocks in Markdown files gives you the ability to format your descriptions with Markdown and are particularly helpful when building long descriptions, either at the column or model level. In our `dbt_project.yml`, we added `node_colors` at folder levels. 1. To see all these pieces come together, execute this in the command bar: @@ -1926,4 +1926,4 @@ Fantastic! You’ve finished the workshop! We hope you feel empowered in using b For more help and information join our [dbt community Slack](https://www.getdbt.com/community/) which contains more than 50,000 data practitioners today. We have a dedicated slack channel #db-snowflake to Snowflake related content. Happy dbt'ing! 
- \ No newline at end of file + diff --git a/website/docs/guides/productionize-your-dbt-databricks-project.md b/website/docs/guides/productionize-your-dbt-databricks-project.md index 33f25070bdb..bada787e01f 100644 --- a/website/docs/guides/productionize-your-dbt-databricks-project.md +++ b/website/docs/guides/productionize-your-dbt-databricks-project.md @@ -197,4 +197,4 @@ To get the most out of both tools, you can use the [persist docs config](/refere - [Databricks + dbt Cloud Quickstart Guide](/guides/databricks) - Reach out to your Databricks account team to get access to preview features on Databricks. - \ No newline at end of file + diff --git a/website/docs/reference/artifacts/catalog-json.md b/website/docs/reference/artifacts/catalog-json.md index 44a3f980c60..54f0c93da90 100644 --- a/website/docs/reference/artifacts/catalog-json.md +++ b/website/docs/reference/artifacts/catalog-json.md @@ -7,7 +7,7 @@ sidebar_label: "Catalog" **Produced by:** [`docs generate`](/reference/commands/cmd-docs) -This file contains information from your data warehouse about the tables and views produced and defined by the resources in your project. Today, dbt uses this file to populate metadata, such as column types and statistics, in the [docs site](/docs/collaborate/documentation). +This file contains information from your data warehouse about the tables and views produced and defined by the resources in your project. Today, dbt uses this file to populate metadata, such as column types and statistics, in the [docs site](/docs/collaborate/build-and-view-your-docs). ### Top-level keys diff --git a/website/docs/reference/artifacts/dbt-artifacts.md b/website/docs/reference/artifacts/dbt-artifacts.md index 5e801d31b16..8d3e1ae29e8 100644 --- a/website/docs/reference/artifacts/dbt-artifacts.md +++ b/website/docs/reference/artifacts/dbt-artifacts.md @@ -5,7 +5,7 @@ sidebar_label: "About dbt artifacts" With every invocation, dbt generates and saves one or more *artifacts*. Several of these are files (`semantic_manifest.json`, `manifest.json`, `catalog.json`, `run_results.json`, and `sources.json`) that are used to power: -- [documentation](/docs/collaborate/documentation) +- [documentation](/docs/collaborate/build-and-view-your-docs) - [state](/reference/node-selection/syntax#about-node-selection) - [visualizing source freshness](/docs/build/sources#snapshotting-source-data-freshness) diff --git a/website/docs/reference/artifacts/manifest-json.md b/website/docs/reference/artifacts/manifest-json.md index 5a487f2f177..296b5250d5d 100644 --- a/website/docs/reference/artifacts/manifest-json.md +++ b/website/docs/reference/artifacts/manifest-json.md @@ -11,7 +11,7 @@ import ManifestVersions from '/snippets/_manifest-versions.md'; This single file contains a full representation of your dbt project's resources (models, tests, macros, etc), including all node configurations and resource properties. Even if you're only running some models or tests, all resources will appear in the manifest (unless they are disabled) with most of their properties. (A few node properties, such as `compiled_sql`, only appear for executed nodes.) -Today, dbt uses this file to populate the [docs site](/docs/collaborate/documentation), and to perform [state comparison](/reference/node-selection/syntax#about-node-selection). Members of the community have used this file to run checks on how many models have descriptions and tests.
+Today, dbt uses this file to populate the [docs site](/docs/collaborate/build-and-view-your-docs), and to perform [state comparison](/reference/node-selection/syntax#about-node-selection). Members of the community have used this file to run checks on how many models have descriptions and tests. ### Top-level keys diff --git a/website/docs/reference/artifacts/other-artifacts.md b/website/docs/reference/artifacts/other-artifacts.md index c4e595782fc..75a4653d685 100644 --- a/website/docs/reference/artifacts/other-artifacts.md +++ b/website/docs/reference/artifacts/other-artifacts.md @@ -7,7 +7,7 @@ sidebar_label: "Other artifacts" **Produced by:** [`docs generate`](/reference/commands/cmd-docs) -This file is the skeleton of the [auto-generated dbt documentation website](/docs/collaborate/documentation). The contents of the site are populated by the [manifest](/reference/artifacts/manifest-json) and [catalog](catalog-json). +This file is the skeleton of the [auto-generated dbt documentation website](/docs/collaborate/build-and-view-your-docs). The contents of the site are populated by the [manifest](/reference/artifacts/manifest-json) and [catalog](catalog-json). Note: the source code for `index.json` comes from the [dbt-docs repo](https://github.com/dbt-labs/dbt-docs). Head over there if you want to make a bug report, suggestion, or contribution relating to the documentation site. diff --git a/website/docs/reference/commands/cmd-docs.md b/website/docs/reference/commands/cmd-docs.md index 176bd4106cd..60b3049ccf2 100644 --- a/website/docs/reference/commands/cmd-docs.md +++ b/website/docs/reference/commands/cmd-docs.md @@ -42,7 +42,7 @@ dbt docs generate --no-compile Use the `--empty-catalog` argument to skip running the database queries to populate `catalog.json`. When this flag is provided, `dbt docs generate` will skip step (3) described above. -This is not recommended for production environments, as it means that your documentation will be missing information gleaned from database metadata (the full set of columns in each table, and statistics about those tables). It can speed up `docs generate` in development, when you just want to visualize lineage and other information defined within your project. To learn how to build your documentation in dbt Cloud, refer to [build your docs in dbt Cloud](/docs/collaborate/build-and-view-your-docs#generating-documentation). +This is not recommended for production environments, as it means that your documentation will be missing information gleaned from database metadata (the full set of columns in each table, and statistics about those tables). It can speed up `docs generate` in development, when you just want to visualize lineage and other information defined within your project. To learn how to build your documentation in dbt Cloud, refer to [build your docs in dbt Cloud](/docs/collaborate/build-and-view-your-docs). **Example**: ``` diff --git a/website/docs/reference/dbt-jinja-functions/doc.md b/website/docs/reference/dbt-jinja-functions/doc.md index 51ca6ad2059..ee0b75b2e19 100644 --- a/website/docs/reference/dbt-jinja-functions/doc.md +++ b/website/docs/reference/dbt-jinja-functions/doc.md @@ -5,7 +5,7 @@ id: "doc" description: "Use the `doc` to reference docs blocks in description fields." --- -The `doc` function is used to reference docs blocks in the description field of schema.yml files. It is analogous to the `ref` function. For more information, consult the [Documentation guide](/docs/collaborate/documentation). 
+The `doc` function is used to reference docs blocks in the description field of schema.yml files. It is analogous to the `ref` function. For more information, consult the [Documentation guide](/docs/collaborate/build-and-view-your-docs). Usage: diff --git a/website/docs/reference/project-configs/docs-paths.md b/website/docs/reference/project-configs/docs-paths.md index 910cfbb0cce..51ff5c5ccca 100644 --- a/website/docs/reference/project-configs/docs-paths.md +++ b/website/docs/reference/project-configs/docs-paths.md @@ -13,7 +13,7 @@ docs-paths: [directorypath]
## Definition -Optionally specify a custom list of directories where [docs blocks](/docs/collaborate/documentation#docs-blocks) are located. +Optionally specify a custom list of directories where [docs blocks](/docs/build/documentation#docs-blocks) are located. ## Default diff --git a/website/docs/reference/resource-properties/description.md b/website/docs/reference/resource-properties/description.md index fee1d50aaf3..ce0c7c42074 100644 --- a/website/docs/reference/resource-properties/description.md +++ b/website/docs/reference/resource-properties/description.md @@ -157,7 +157,7 @@ A user-defined description. Can be used to document: - analyses, and analysis columns - macros, and macro arguments -These descriptions are used in the documentation website rendered by dbt (refer to [the documentation guide](/docs/collaborate/documentation) or [dbt Explorer](/docs/collaborate/explore-projects)). +These descriptions are used in the documentation website rendered by dbt (refer to [the documentation guide](/docs/build/documentation) or [dbt Explorer](/docs/collaborate/explore-projects)). Descriptions can include markdown, as well as the [`doc` jinja function](/reference/dbt-jinja-functions/doc). diff --git a/website/docs/terms/dag.md b/website/docs/terms/dag.md index 0216332d953..93e2956ebb3 100644 --- a/website/docs/terms/dag.md +++ b/website/docs/terms/dag.md @@ -79,7 +79,7 @@ Instead of manually auditing your DAG for best practices, the [dbt project evalu ## dbt and DAGs -The marketing team at dbt Labs would be upset with us if we told you we think dbt actually stood for “dag build tool,” but one of the key elements of dbt is its ability to generate documentation and infer relationships between models. And one of the hallmark features of [dbt Docs](https://docs.getdbt.com/docs/collaborate/documentation) is the Lineage Graph (DAG) of your dbt project. +The marketing team at dbt Labs would be upset with us if we told you we think dbt actually stood for “dag build tool,” but one of the key elements of dbt is its ability to generate documentation and infer relationships between models. And one of the hallmark features of [dbt Docs](https://docs.getdbt.com/docs/build/documentation) is the Lineage Graph (DAG) of your dbt project. Whether you’re using dbt Core or Cloud, dbt docs and the Lineage Graph are available to all dbt developers. The Lineage Graph in dbt Docs can show a model or source’s entire lineage, all within a visual frame. Clicking within a model, you can view the Lineage Graph and adjust selectors to only show certain models within the DAG. Analyzing the DAG here is a great way to diagnose potential inefficiencies or lack of modularity in your dbt project. diff --git a/website/docs/terms/dry.md b/website/docs/terms/dry.md index ec1c9229567..04b83642a08 100644 --- a/website/docs/terms/dry.md +++ b/website/docs/terms/dry.md @@ -26,7 +26,7 @@ WET, which stands for “Write Everything Twice,” is the opposite of DRY. It's Well, how would you know if your code isn't DRY enough? That’s kind of subjective and will vary by the norms set within your organization. That said, a good rule of thumb is [the Rule of Three](https://en.wikipedia.org/wiki/Rule_of_three_(writing)#:~:text=The%20rule%20of%20three%20is,or%20effective%20than%20other%20numbers.). This rule states that the _third_ time you encounter a certain pattern, you should probably abstract it into some reusable unit. -There is, of course, a tradeoff between simplicity and conciseness in code. 
The more abstractions you create, the harder it can be for others to understand and maintain your code without proper documentation. So, the moral of the story is: DRY code is great as long as you [write great documentation.](https://docs.getdbt.com/docs/collaborate/documentation) +There is, of course, a tradeoff between simplicity and conciseness in code. The more abstractions you create, the harder it can be for others to understand and maintain your code without proper documentation. So, the moral of the story is: DRY code is great as long as you [write great documentation.](https://docs.getdbt.com/docs/build/documentation) ### Save time & energy diff --git a/website/docs/terms/primary-key.md b/website/docs/terms/primary-key.md index fde3ff44ac7..c8fc327af0d 100644 --- a/website/docs/terms/primary-key.md +++ b/website/docs/terms/primary-key.md @@ -151,7 +151,7 @@ When we talk about testing our primary keys, we really mean testing their unique 2. For databases that don’t offer support and enforcement of primary keys, you’re going to need to regularly test that primary keys aren’t violating their golden rule of uniqueness and non-nullness. To do this, we recommend implementing a tool like dbt that allows you to define version-controlled and code-based tests on your data models. Using these tests, you should create [not null](https://docs.getdbt.com/reference/resource-properties/tests#not_null) and [unique](https://docs.getdbt.com/reference/resource-properties/tests#unique) tests for every primary key field throughout your dbt project. Other methods for primary key testing may look like writing custom tests or ad hoc queries that check for uniqueness and non-nullness. :::tip Tip -You can use dbt’s [documentation](https://docs.getdbt.com/docs/collaborate/documentation) and [testing](https://docs.getdbt.com/reference/resource-properties/tests) capabilities to clearly identify and QA primary keys in your data models. For your primary key column, you should mention that the field is the unique identifier for that table and test for uniqueness and non-nullness. +You can use dbt’s [documentation](https://docs.getdbt.com/docs/build/documentation) and [testing](https://docs.getdbt.com/reference/resource-properties/tests) capabilities to clearly identify and QA primary keys in your data models. For your primary key column, you should mention that the field is the unique identifier for that table and test for uniqueness and non-nullness. 
::: ## Conclusion diff --git a/website/sidebars.js b/website/sidebars.js index 28fb32dfa5d..a087c24e3f7 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -301,6 +301,7 @@ const sidebarSettings = { "docs/build/unit-tests", ], }, + "docs/build/documentation", "docs/build/snapshots", "docs/build/seeds", "docs/build/jinja-macros", @@ -468,7 +469,7 @@ const sidebarSettings = { "docs/collaborate/collaborate-with-others", { type: "category", - label: "Explore dbt projects", + label: "Discover data with dbt Explorer", link: { type: "doc", id: "docs/collaborate/explore-projects" }, items: [ "docs/collaborate/explore-projects", @@ -494,10 +495,9 @@ const sidebarSettings = { }, { type: "category", - label: "Document your dbt projects", - link: { type: "doc", id: "docs/collaborate/documentation" }, + label: "Document your projects", + link: { type: "doc", id: "docs/collaborate/build-and-view-your-docs" }, items: [ - "docs/collaborate/documentation", "docs/collaborate/build-and-view-your-docs", ], }, diff --git a/website/snippets/tutorial-document-your-models.md b/website/snippets/tutorial-document-your-models.md index 9913dbcd1d7..736ce567d57 100644 --- a/website/snippets/tutorial-document-your-models.md +++ b/website/snippets/tutorial-document-your-models.md @@ -1,4 +1,4 @@ -Adding [documentation](/docs/collaborate/documentation) to your project allows you to describe your models in rich detail, and share that information with your team. Here, we're going to add some basic documentation to our project. +Adding [documentation](/docs/build/documentation) to your project allows you to describe your models in rich detail, and share that information with your team. Here, we're going to add some basic documentation to our project. 1. Update your `models/schema.yml` file to include some descriptions, such as those below. diff --git a/website/snippets/tutorial-next-steps-tests.md b/website/snippets/tutorial-next-steps-tests.md index 39764cede0a..26fd356e87b 100644 --- a/website/snippets/tutorial-next-steps-tests.md +++ b/website/snippets/tutorial-next-steps-tests.md @@ -2,4 +2,4 @@ Before moving on from testing, make a change and see how it affects your results * Write a test that fails, for example, omit one of the order statuses in the `accepted_values` list. What does a failing test look like? Can you debug the failure? * Run the tests for one model only. If you grouped your `stg_` models into a directory, try running the tests for all the models in that directory. -* Use a [docs block](/docs/collaborate/documentation#using-docs-blocks) to add a Markdown description to a model. +* Use a [docs block](/docs/build/documentation#using-docs-blocks) to add a Markdown description to a model. 
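Editor's note on the documentation hunks above (docs blocks, `docs-paths`, the `doc()` function, and the tutorial snippets): they all describe the same pattern of writing descriptions in `schema.yml`, optionally pulling longer text from a Markdown docs block. A minimal sketch of that pattern follows — the model, column, and docs-block names are hypothetical, and the `doc()` call assumes a block defined elsewhere as `{% docs customer_status %} … {% enddocs %}` in one of the project's `docs-paths` directories.

```yml
# models/schema.yml — minimal sketch; model/column/docs-block names are hypothetical
version: 2

models:
  - name: customers
    description: One record per customer, built from the staging models.
    columns:
      - name: customer_id
        description: Primary key for this model.
      - name: status
        # pulls a longer, formatted description from a Markdown docs block
        description: '{{ doc("customer_status") }}'
```

Plain-string descriptions are fine for short notes; the `doc()` reference is the option the FAQ above recommends when a description needs more than a sentence.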
diff --git a/website/static/img/docs/dbt-cloud/access-explorer.gif b/website/static/img/docs/dbt-cloud/access-explorer.gif new file mode 100644 index 00000000000..5eaf04e4170 Binary files /dev/null and b/website/static/img/docs/dbt-cloud/access-explorer.gif differ diff --git a/website/static/img/docs/dbt-cloud/dbt-docs-generate-command.png b/website/static/img/docs/dbt-cloud/dbt-docs-generate-command.png deleted file mode 100644 index 0ec0cae7d39..00000000000 Binary files a/website/static/img/docs/dbt-cloud/dbt-docs-generate-command.png and /dev/null differ diff --git a/website/static/img/docs/dbt-cloud/discovery-api/dbt-dag.jpg b/website/static/img/docs/dbt-cloud/discovery-api/dbt-dag.jpg deleted file mode 100644 index df05049d2a1..00000000000 Binary files a/website/static/img/docs/dbt-cloud/discovery-api/dbt-dag.jpg and /dev/null differ diff --git a/website/static/img/docs/dbt-cloud/explore-icon.jpg b/website/static/img/docs/dbt-cloud/explore-icon.jpg new file mode 100644 index 00000000000..d4912ecface Binary files /dev/null and b/website/static/img/docs/dbt-cloud/explore-icon.jpg differ diff --git a/website/static/img/docs/dbt-cloud/explore-nav.jpg b/website/static/img/docs/dbt-cloud/explore-nav.jpg new file mode 100644 index 00000000000..c0d26e58c56 Binary files /dev/null and b/website/static/img/docs/dbt-cloud/explore-nav.jpg differ diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/98c05c5-Screen_Shot_2019-02-08_at_9.18.22_PM.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/98c05c5-Screen_Shot_2019-02-08_at_9.18.22_PM.png deleted file mode 100644 index 25298f29fb3..00000000000 Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/98c05c5-Screen_Shot_2019-02-08_at_9.18.22_PM.png and /dev/null differ diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png index 2c02b5e7bba..be7a96f7177 100644 Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/data-sources.png differ diff --git a/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png b/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png index 8741e3a66a2..8d7ab68646b 100644 Binary files a/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png and b/website/static/img/docs/dbt-cloud/using-dbt-cloud/edit-job-generate-artifacts.png differ diff --git a/website/vercel.json b/website/vercel.json index d0660bb3dad..7935f0dbaca 100644 --- a/website/vercel.json +++ b/website/vercel.json @@ -2,6 +2,16 @@ "cleanUrls": true, "trailingSlash": false, "redirects": [ + { + "source": "/docs/collaborate/cloud-build-and-view-your-docs", + "destination": "/docs/collaborate/build-and-view-your-docs", + "permanent": true + }, + { + "source": "/docs/collaborate/documentation", + "destination":"/docs/build/documentation", + "permanent": true + }, { "source": "/docs/use-dbt-semantic-layer/tableau", "destination": "/docs/cloud-integrations/semantic-layer/tableau", @@ -1318,7 +1328,7 @@ }, { "source": "/docs/dbt-cloud/using-dbt-cloud/cloud-generating-documentation", - "destination": "/docs/collaborate/build-and-view-your-docs", + "destination": "/docs/collaborate/explore-projects", "permanent": true }, {