@@ -54,7 +54,7 @@ You can install [MetricFlow](https://github.com/dbt-labs/metricflow#getting-star
1. Create or activate your virtual environment `python -m venv venv`
2. Run `pip install dbt-metricflow`
- * You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. For example, for a Snowflake adapter run `pip install "dbt-metricflow[snowflake]"`
+ * You can install MetricFlow from PyPI as an extension of your dbt adapter in the command line. To install it, run `python -m pip install "dbt-metricflow[your_adapter_name]"`, replacing `your_adapter_name` with the name of your adapter. For example, for a Snowflake adapter, run `python -m pip install "dbt-metricflow[snowflake]"`
**Note**, you'll need to manage versioning between dbt Core, your adapter, and MetricFlow.
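+
+For example, a complete installation in a fresh virtual environment might look like this (a sketch, assuming a Snowflake adapter and a virtual environment named `venv`):
+
+```shell
+python -m venv venv                # create the virtual environment
+source venv/bin/activate           # activate it (on Windows: venv\Scripts\activate)
+python -m pip install "dbt-metricflow[snowflake]"
+```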
diff --git a/website/docs/docs/cloud/cloud-cli-installation.md b/website/docs/docs/cloud/cloud-cli-installation.md
index b945bede160..f3294477611 100644
--- a/website/docs/docs/cloud/cloud-cli-installation.md
+++ b/website/docs/docs/cloud/cloud-cli-installation.md
@@ -155,9 +155,9 @@ If you already have dbt Core installed, the dbt Cloud CLI may conflict. Here are
- Uninstall the dbt Cloud CLI using the command: `pip uninstall dbt`
- Reinstall dbt Core using the following command, replacing "adapter_name" with the appropriate adapter name:
```shell
- pip install dbt-adapter_name --force-reinstall
+ python -m pip install dbt-adapter_name --force-reinstall
```
- For example, if I used Snowflake as an adapter, I would run: `pip install dbt-snowflake --force-reinstall`
+ For example, if you use the Snowflake adapter, run: `python -m pip install dbt-snowflake --force-reinstall`
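+
+Putting these steps together, a Snowflake user might run the following (a sketch; `dbt --version` confirms afterward which dbt Core and adapter versions are active):
+
+```shell
+pip uninstall dbt
+python -m pip install dbt-snowflake --force-reinstall
+dbt --version
+```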
--------
@@ -243,7 +243,7 @@ To update, follow the same process explained in [Windows](/docs/cloud/cloud-cli-
To update:
- Make sure you're in your virtual environment
-- Run `pip install --upgrade dbt`.
+- Run `python -m pip install --upgrade dbt`.
diff --git a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
index b387c64788f..e104ea8640c 100644
--- a/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
+++ b/website/docs/docs/collaborate/cloud-build-and-view-your-docs.md
@@ -5,7 +5,7 @@ description: "Automatically generate project documentation as you run jobs."
pagination_next: null
---
-dbt enables you to generate documentation for your project and data warehouse, and renders the documentation in a website. For more information, see [Documentation](/docs/collaborate/documentation).
+dbt Cloud enables you to generate documentation for your project and data platform, rendering it as a website. The documentation is only updated with new information after a fully successful job run, ensuring accuracy and relevance. Refer to [Documentation](/docs/collaborate/documentation) for more details.
## Set up a documentation job
@@ -52,13 +52,15 @@ You configure project documentation to generate documentation when the job you s
To generate documentation in the dbt Cloud IDE, run the `dbt docs generate` command in the
Command Bar in the dbt Cloud IDE. This command will generate the Docs for your dbt project as it exists in development in your IDE session.
-
+
After generating your documentation, you can click the **Book** icon above the file tree, to see the latest version of your documentation rendered in a new browser window.
## Viewing documentation
-Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link will always navigate you to the most recent version of your project's documentation in dbt Cloud.
+Once you set up a job to generate documentation for your project, you can click **Documentation** in the top left. Your project's documentation should open. This link always takes you to the most recent version of your project's documentation in dbt Cloud.
+
+These generated docs always reflect the project's last fully successful run: if a run includes any failed tasks, including tests, the documentation is not updated with that run's changes.
The dbt Cloud IDE makes it possible to view [documentation](/docs/collaborate/documentation)
for your dbt project while your code is still in development. With this workflow, you can inspect and verify what your project's generated documentation will look like before your changes are released to production.
diff --git a/website/docs/docs/connect-adapters.md b/website/docs/docs/connect-adapters.md
index e301cfc237e..6ccc1b4f376 100644
--- a/website/docs/docs/connect-adapters.md
+++ b/website/docs/docs/connect-adapters.md
@@ -15,7 +15,7 @@ Explore the fastest and most reliable way to deploy dbt using dbt Cloud, a hoste
Install dbt Core, an open-source tool, locally using the command line. dbt communicates with a number of different data platforms by using a dedicated adapter plugin for each. When you install dbt Core, you'll also need to install the specific adapter for your database, [connect to dbt Core](/docs/core/about-core-setup), and set up a `profiles.yml` file.
-With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `pip install adapter-name`. For example to install Snowflake, use the command `pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation).
+With a few exceptions [^1], you can install all [Verified adapters](/docs/supported-data-platforms) from PyPI using `python -m pip install adapter-name`. For example, to install the Snowflake adapter, use the command `python -m pip install dbt-snowflake`. The installation will include `dbt-core` and any other required dependencies, which may include both other dependencies and even other adapter plugins. Read more about [installing dbt](/docs/core/installation).
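+
+As a quick sanity check after installing (a sketch using the Snowflake adapter as the example; `dbt --version` lists the installed dbt-core and adapter versions):
+
+```shell
+python -m pip install dbt-snowflake
+dbt --version
+```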
[^1]: Here are the two different adapters. Use the PyPI package name when installing with `pip`
diff --git a/website/docs/docs/core/connect-data-platform/hive-setup.md b/website/docs/docs/core/connect-data-platform/hive-setup.md
index 92210162324..33e45e28a0d 100644
--- a/website/docs/docs/core/connect-data-platform/hive-setup.md
+++ b/website/docs/docs/core/connect-data-platform/hive-setup.md
@@ -131,7 +131,7 @@ you must install the `dbt-hive` plugin.
The following commands will install the latest version of `dbt-hive` as well as the requisite version of `dbt-core` and `impyla` driver used for connections.
```
-pip install dbt-hive
+python -m pip install dbt-hive
```
### Supported Functionality
diff --git a/website/docs/docs/core/connect-data-platform/spark-setup.md b/website/docs/docs/core/connect-data-platform/spark-setup.md
index 3cface96440..e8d65153058 100644
--- a/website/docs/docs/core/connect-data-platform/spark-setup.md
+++ b/website/docs/docs/core/connect-data-platform/spark-setup.md
@@ -35,15 +35,15 @@ If connecting to a Spark cluster via the generic thrift or http methods, it requ
```zsh
# odbc connections
-$ pip install "dbt-spark[ODBC]"
+$ python -m pip install "dbt-spark[ODBC]"
# thrift or http connections
-$ pip install "dbt-spark[PyHive]"
+$ python -m pip install "dbt-spark[PyHive]"
```
```zsh
# session connections
-$ pip install "dbt-spark[session]"
+$ python -m pip install "dbt-spark[session]"
```
Configuring {frontMatter.meta.pypi_package}
diff --git a/website/docs/docs/core/connect-data-platform/starrocks-setup.md b/website/docs/docs/core/connect-data-platform/starrocks-setup.md
index e5c1abac037..485e1d18fb7 100644
--- a/website/docs/docs/core/connect-data-platform/starrocks-setup.md
+++ b/website/docs/docs/core/connect-data-platform/starrocks-setup.md
@@ -34,7 +34,7 @@ meta:
pip is the easiest way to install the adapter:
-pip install {frontMatter.meta.pypi_package}
+python -m pip install {frontMatter.meta.pypi_package}
Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
diff --git a/website/docs/docs/core/connect-data-platform/trino-setup.md b/website/docs/docs/core/connect-data-platform/trino-setup.md
index 199a6da8510..a7dc658358f 100644
--- a/website/docs/docs/core/connect-data-platform/trino-setup.md
+++ b/website/docs/docs/core/connect-data-platform/trino-setup.md
@@ -255,7 +255,7 @@ The only authentication parameter to set for OAuth 2.0 is `method: oauth`. If yo
For more information, refer to both [OAuth 2.0 authentication](https://trino.io/docs/current/security/oauth2.html) in the Trino docs and the [README](https://github.com/trinodb/trino-python-client#oauth2-authentication) for the Trino Python client.
-It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.
+It's recommended that you install `keyring` to cache the OAuth 2.0 token over multiple dbt invocations by running `python -m pip install 'trino[external-authentication-token-cache]'`. The `keyring` package is not installed by default.
#### Example profiles.yml for OAuth
diff --git a/website/docs/docs/core/connect-data-platform/upsolver-setup.md b/website/docs/docs/core/connect-data-platform/upsolver-setup.md
index 6b2f410fc07..8e4203e0b0c 100644
--- a/website/docs/docs/core/connect-data-platform/upsolver-setup.md
+++ b/website/docs/docs/core/connect-data-platform/upsolver-setup.md
@@ -33,7 +33,7 @@ pagination_next: null
pip is the easiest way to install the adapter:
-pip install {frontMatter.meta.pypi_package}
+python -m pip install {frontMatter.meta.pypi_package}
Installing {frontMatter.meta.pypi_package}
will also install dbt-core
and any other dependencies.
diff --git a/website/docs/docs/core/docker-install.md b/website/docs/docs/core/docker-install.md
index dfb2a669e34..8de3bcb5c06 100644
--- a/website/docs/docs/core/docker-install.md
+++ b/website/docs/docs/core/docker-install.md
@@ -5,7 +5,7 @@ description: "You can use Docker to install dbt and adapter plugins from the com
dbt Core and all adapter plugins maintained by dbt Labs are available as [Docker](https://docs.docker.com/) images, and distributed via [GitHub Packages](https://docs.github.com/en/packages/learn-github-packages/introduction-to-github-packages) in a [public registry](https://github.com/dbt-labs/dbt-core/pkgs/container/dbt-core).
-Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `pip install dbt-core dbt-<adapter>` takes longer to run, and will always install the latest compatible versions of every dependency.
+Using a prebuilt Docker image to install dbt Core in production has a few benefits: it already includes dbt-core, one or more database adapters, and pinned versions of all their dependencies. By contrast, `python -m pip install dbt-core dbt-<adapter>` takes longer to run, and will always install the latest compatible versions of every dependency.
You might also be able to use Docker to install and develop locally if you don't have a Python environment set up. Note that running dbt in this manner can be significantly slower if your operating system differs from the system that built the Docker image. If you're a frequent local developer, we recommend that you install dbt Core via [Homebrew](/docs/core/homebrew-install) or [pip](/docs/core/pip-install) instead.
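+
+For example, running a dbt command against a prebuilt image might look like the following (a sketch: the image name, tag, and local paths are assumptions to replace with your own; check the public registry for available images and tags):
+
+```shell
+docker run --rm \
+  --network=host \
+  --mount type=bind,source=/path/to/your/project,target=/usr/app \
+  --mount type=bind,source=/path/to/your/profiles.yml,target=/root/.dbt/profiles.yml \
+  ghcr.io/dbt-labs/dbt-postgres:1.7.4 \
+  ls
+```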
diff --git a/website/docs/docs/core/pip-install.md b/website/docs/docs/core/pip-install.md
index 44fac00e493..e1a0e65312c 100644
--- a/website/docs/docs/core/pip-install.md
+++ b/website/docs/docs/core/pip-install.md
alias env_dbt='source <PATH_TO_VIRTUAL_ENV>/bin/activate'
Once you know [which adapter](/docs/supported-data-platforms) you're using, you can install it as `dbt-<adapter>`. For example, if using Postgres:
```shell
-pip install dbt-postgres
+python -m pip install dbt-postgres
```
This will install `dbt-core` and `dbt-postgres` _only_:
@@ -62,7 +62,7 @@ All adapters build on top of `dbt-core`. Some also depend on other adapters: for
To upgrade a specific adapter plugin:
```shell
-pip install --upgrade dbt-<adapter>
+python -m pip install --upgrade dbt-<adapter>
```
### Install dbt-core only
@@ -70,7 +70,7 @@ pip install --upgrade dbt-
If you're building a tool that integrates with dbt Core, you may want to install the core library alone, without a database adapter. Note that you won't be able to use dbt as a CLI tool.
```shell
-pip install dbt-core
+python -m pip install dbt-core
```
### Change dbt Core versions
@@ -79,13 +79,13 @@ You can upgrade or downgrade versions of dbt Core by using the `--upgrade` optio
To upgrade dbt to the latest version:
```
-pip install --upgrade dbt-core
+python -m pip install --upgrade dbt-core
```
To downgrade to an older version, specify the version you want to use. This command can be useful when you're resolving package dependencies. As an example:
```
-pip install --upgrade dbt-core==0.19.0
+python -m pip install --upgrade dbt-core==0.19.0
```
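+
+After changing versions, it's worth confirming what's actually installed; `dbt --version` prints the active dbt Core and adapter versions:
+
+```shell
+dbt --version
+```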
### `pip install dbt`
@@ -95,7 +95,7 @@ Note that, as of v1.0.0, `pip install dbt` is no longer supported and will raise
If you have workflows or integrations that relied on installing the package named `dbt`, you can achieve the same behavior going forward by installing the same five packages that it used:
```shell
-pip install \
+python -m pip install \
dbt-core \
dbt-postgres \
dbt-redshift \
diff --git a/website/docs/docs/core/source-install.md b/website/docs/docs/core/source-install.md
index 42086159c03..d17adc13c53 100644
--- a/website/docs/docs/core/source-install.md
+++ b/website/docs/docs/core/source-install.md
@@ -17,10 +17,10 @@ To install `dbt-core` from the GitHub code source:
```shell
git clone https://github.com/dbt-labs/dbt-core.git
cd dbt-core
-pip install -r requirements.txt
+python -m pip install -r requirements.txt
```
-This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `pip install -e editable-requirements.txt` instead.
+This will install `dbt-core` and `dbt-postgres`. To install in editable mode (includes your local changes as you make them), use `python -m pip install -r editable-requirements.txt` instead.
### Installing adapter plugins
@@ -29,12 +29,12 @@ To install an adapter plugin from source, you will need to first locate its sour
```shell
git clone https://github.com/dbt-labs/dbt-redshift.git
cd dbt-redshift
-pip install .
+python -m pip install .
```
You do _not_ need to install `dbt-core` before installing an adapter plugin -- the plugin includes `dbt-core` among its dependencies, and it will install the latest compatible version automatically.
-To install in editable mode, such as while contributing, use `pip install -e .` instead.
+To install in editable mode, such as while contributing, use `python -m pip install -e .` instead.
diff --git a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
index 9ebd3c64cf3..18863daba6f 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/00-upgrading-to-v1.7.md
@@ -32,6 +32,8 @@ This is a relatively small behavior change, but worth calling out in case you no
- Don't add a `freshness:` block.
- Explicitly set `freshness: null`
+Beginning with v1.7, running [`dbt deps`](/reference/commands/deps) creates or updates the `package-lock.yml` file in the _project_root_ alongside `packages.yml`. The `package-lock.yml` file contains a record of all installed packages; if subsequent `dbt deps` runs include no updated packages in `dependencies.yml` or `packages.yml`, dbt-core installs from `package-lock.yml`.
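+
+As a rough sketch of what gets pinned (the package shown and its version are illustrative, not authoritative):
+
+```yaml
+# package-lock.yml -- written by `dbt deps`; records the resolved version of each package
+packages:
+  - package: dbt-labs/dbt_utils
+    version: 1.1.1
+```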
+
## New and changed features and functionality
- [`dbt docs generate`](/reference/commands/cmd-docs) now supports `--select` to generate [catalog metadata](/reference/artifacts/catalog-json) for a subset of your project. Currently available for Snowflake and Postgres only, but other adapters are coming soon.
diff --git a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md
index 3f45e44076c..c0ba804cd78 100644
--- a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md
+++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md
@@ -45,7 +45,7 @@ Global project macros have been reorganized, and some old unused macros have bee
### Installation
- [Installation docs](/docs/supported-data-platforms) reflects adapter-specific installations
-`pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `pip install dbt-<adapter>`.
+`python -m pip install dbt` is no longer supported, and will raise an explicit error. Install the specific adapter plugin you need as `python -m pip install dbt-<adapter>`.
- `brew install dbt` is no longer supported. Install the specific adapter plugin you need (among Postgres, Redshift, Snowflake, or BigQuery) as `brew install dbt-<adapter>`.
- Removed official support for python 3.6, which is reaching end of life on December 23, 2021
diff --git a/website/docs/docs/dbt-versions/core-versions.md b/website/docs/docs/dbt-versions/core-versions.md
index 2467f3c946b..c497401a17d 100644
--- a/website/docs/docs/dbt-versions/core-versions.md
+++ b/website/docs/docs/dbt-versions/core-versions.md
@@ -56,7 +56,7 @@ After a minor version reaches the end of its critical support period, one year a
### Future versions
-We aim to release a new minor "feature" every 3 months. _This is an indicative timeline ONLY._ For the latest information about upcoming releases, including their planned release dates and which features and fixes might be included in each, always consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones).
+For the latest information about upcoming releases, including planned release dates and which features and fixes might be included, consult the [`dbt-core` repository milestones](https://github.com/dbt-labs/dbt-core/milestones) and [product roadmaps](https://github.com/dbt-labs/dbt-core/tree/main/docs/roadmap).
## Best practices for upgrading
diff --git a/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md b/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md
index d52ad2d4081..64fa2026d04 100644
--- a/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md
+++ b/website/docs/docs/dbt-versions/release-notes/11-Feb-2023/feb-ide-updates.md
@@ -13,7 +13,6 @@ Learn more about the [February changes](https://getdbt.slack.com/archives/C03SAH
## New features
- Support for custom node colors in the IDE DAG visualization
-- Autosave prototype is now available under feature flag. [Contact](mailto:cloud-ide-feedback@dbtlabs.com) the dbt Labs IDE team to try this out
- Ref autocomplete includes models from seeds and snapshots
- Prevent menus from getting cropped (git controls dropdown, file tree dropdown, build button, editor tab options)
- Additional option to access the file menu by right-clicking on the files and folders in the file tree
diff --git a/website/docs/faqs/Core/install-pip-best-practices.md b/website/docs/faqs/Core/install-pip-best-practices.md
index e36d58296ec..72360a52acc 100644
--- a/website/docs/faqs/Core/install-pip-best-practices.md
+++ b/website/docs/faqs/Core/install-pip-best-practices.md
@@ -30,6 +30,6 @@ Before installing dbt, make sure you have the latest versions:
```shell
-pip install --upgrade pip wheel setuptools
+python -m pip install --upgrade pip wheel setuptools
```
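+
+An end-to-end setup following these practices might look like this (a sketch, assuming the Postgres adapter and a virtual environment named `dbt-env`):
+
+```shell
+python -m venv dbt-env
+source dbt-env/bin/activate
+python -m pip install --upgrade pip wheel setuptools
+python -m pip install dbt-postgres
+```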
diff --git a/website/docs/guides/adapter-creation.md b/website/docs/guides/adapter-creation.md
index aa4819e73d0..8bf082b04a0 100644
--- a/website/docs/guides/adapter-creation.md
+++ b/website/docs/guides/adapter-creation.md
@@ -799,7 +799,7 @@ dbt-tests-adapter
```sh
-pip install -r dev_requirements.txt
+python -m pip install -r dev_requirements.txt
```
### Set up and configure pytest
diff --git a/website/docs/guides/codespace-qs.md b/website/docs/guides/codespace-qs.md
index 7712ed8f8e8..b28b0ddaacf 100644
--- a/website/docs/guides/codespace-qs.md
+++ b/website/docs/guides/codespace-qs.md
@@ -61,7 +61,7 @@ If you'd like to work with a larger selection of Jaffle Shop data, you can gener
1. Install the Python package called [jafgen](https://pypi.org/project/jafgen/). At the terminal's prompt, run:
```shell
- /workspaces/test (main) $ pip install jafgen
+ /workspaces/test (main) $ python -m pip install jafgen
```
1. When installation is done, run:
diff --git a/website/docs/guides/custom-cicd-pipelines.md b/website/docs/guides/custom-cicd-pipelines.md
index 672c6e6dab8..bd6d7617623 100644
--- a/website/docs/guides/custom-cicd-pipelines.md
+++ b/website/docs/guides/custom-cicd-pipelines.md
@@ -336,7 +336,7 @@ lint-project:
rules:
- if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main'
script:
- - pip install sqlfluff==0.13.1
+ - python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
# this job calls the dbt Cloud API to run a job
@@ -379,7 +379,7 @@ steps:
displayName: 'Use Python 3.7'
- script: |
- pip install requests
+ python -m pip install requests
displayName: 'Install python dependencies'
- script: |
@@ -434,7 +434,7 @@ pipelines:
- step:
name: Lint dbt project
script:
- - pip install sqlfluff==0.13.1
+ - python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
'main': # override if your default branch doesn't run on a branch named "main"
diff --git a/website/docs/guides/dremio-lakehouse.md b/website/docs/guides/dremio-lakehouse.md
index 1c59c04d175..59da64a5f88 100644
--- a/website/docs/guides/dremio-lakehouse.md
+++ b/website/docs/guides/dremio-lakehouse.md
@@ -14,7 +14,7 @@ recently_updated: true
---
## Introduction
-This guide will demonstrate how to build a data lakehouse with dbt Core 1.5 or new and Dremio Cloud. You can simplify and optimize your data infrastructure with dbt's robust transformation framework and Dremio’s open and easy data lakehouse. The integrated solution empowers companies to establish a strong data and analytics foundation, fostering self-service analytics and enhancing business insights while simplifying operations by eliminating the necessity to write complex Extract, Transform, and Load (ETL) pipelines.
+This guide will demonstrate how to build a data lakehouse with dbt Core 1.5 or newer and Dremio Cloud. You can simplify and optimize your data infrastructure with dbt's robust transformation framework and Dremio’s open and easy data lakehouse. The integrated solution empowers companies to establish a strong data and analytics foundation, fostering self-service analytics and enhancing business insights while simplifying operations by eliminating the necessity to write complex Extract, Transform, and Load (ETL) pipelines.
### Prerequisites
diff --git a/website/docs/guides/set-up-ci.md b/website/docs/guides/set-up-ci.md
index 83362094ec6..89d7c5a14fa 100644
--- a/website/docs/guides/set-up-ci.md
+++ b/website/docs/guides/set-up-ci.md
@@ -167,7 +167,7 @@ jobs:
with:
python-version: "3.9"
- name: Install SQLFluff
- run: "pip install sqlfluff"
+ run: "python -m pip install sqlfluff"
- name: Lint project
run: "sqlfluff lint models --dialect snowflake"
@@ -204,7 +204,7 @@ lint-project:
rules:
- if: $CI_PIPELINE_SOURCE == "push" && $CI_COMMIT_BRANCH != 'main'
script:
- - pip install sqlfluff
+ - python -m pip install sqlfluff
- sqlfluff lint models --dialect snowflake
```
@@ -235,7 +235,7 @@ pipelines:
- step:
name: Lint dbt project
script:
- - pip install sqlfluff==0.13.1
+ - python -m pip install sqlfluff==0.13.1
- sqlfluff lint models --dialect snowflake --rules L019,L020,L021,L022
'main': # override if your default branch doesn't run on a branch named "main"
diff --git a/website/docs/guides/sl-migration.md b/website/docs/guides/sl-migration.md
index 0cfde742af2..c3cca81f68e 100644
--- a/website/docs/guides/sl-migration.md
+++ b/website/docs/guides/sl-migration.md
@@ -25,10 +25,10 @@ dbt Labs recommends completing these steps in a local dev environment (such as t
1. Create new Semantic Model configs as YAML files in your dbt project.*
1. Upgrade the metrics configs in your project to the new spec.*
1. Delete your old metrics file or remove the `.yml` file extension so they're ignored at parse time. Remove the `dbt-metrics` package from your project. Remove any macros that reference `dbt-metrics`, like `metrics.calculate()`. Make sure that any packages you’re using don't have references to the old metrics spec.
-1. Install the CLI with `pip install "dbt-metricflow[your_adapter_name]"`. For example:
+1. Install the CLI with `python -m pip install "dbt-metricflow[your_adapter_name]"`. For example:
```bash
- pip install "dbt-metricflow[snowflake]"
+ python -m pip install "dbt-metricflow[snowflake]"
```
**Note** - The MetricFlow CLI is not available in the IDE at this time. Support is coming soon.
diff --git a/website/docs/reference/resource-configs/target_schema.md b/website/docs/reference/resource-configs/target_schema.md
index 041f004e20c..9d459b32bad 100644
--- a/website/docs/reference/resource-configs/target_schema.md
+++ b/website/docs/reference/resource-configs/target_schema.md
@@ -74,7 +74,7 @@ Notes:
* Consider whether this use-case is right for you, as downstream `refs` will select from the `dev` version of a snapshot, which can make it hard to validate models that depend on snapshots (see above [FAQ](#faqs))
-
+
```sql
{{
diff --git a/website/snippets/_setup-pages-intro.md b/website/snippets/_setup-pages-intro.md
index 44cbbb1a0c2..5ded5ba5ebc 100644
--- a/website/snippets/_setup-pages-intro.md
+++ b/website/snippets/_setup-pages-intro.md
@@ -13,7 +13,7 @@
Installing {props.meta.pypi_package}
Use `pip` to install the adapter, which automatically installs `dbt-core` and any additional dependencies. Use the following command for installation:
-pip install {props.meta.pypi_package}
+python -m pip install {props.meta.pypi_package}
Configuring {props.meta.pypi_package}
diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md
index 43ebd929cb3..2e9490f089d 100644
--- a/website/snippets/_sl-test-and-query-metrics.md
+++ b/website/snippets/_sl-test-and-query-metrics.md
@@ -48,8 +48,8 @@ The dbt Cloud CLI is strongly recommended to define and query metrics for your d
1. Install [MetricFlow](/docs/build/metricflow-commands) as an extension of a dbt adapter from PyPI.
2. Create or activate your virtual environment with `python -m venv venv` or `source your-venv/bin/activate`.
-3. Run `pip install dbt-metricflow`.
- - You can install MetricFlow using PyPI as an extension of your dbt adapter in the command line. To install the adapter, run `pip install "dbt-metricflow[your_adapter_name]"` and add the adapter name at the end of the command. As an example for a Snowflake adapter, run `pip install "dbt-metricflow[snowflake]"`.
+3. Run `python -m pip install dbt-metricflow`.
+ - You can install MetricFlow from PyPI as an extension of your dbt adapter in the command line. To install it, run `python -m pip install "dbt-metricflow[your_adapter_name]"`, replacing `your_adapter_name` with the name of your adapter. For example, for a Snowflake adapter, run `python -m pip install "dbt-metricflow[snowflake]"`.
- You'll need to manage versioning between dbt Core, your adapter, and MetricFlow.
4. Run `dbt parse`. This allows MetricFlow to build a semantic graph and generate a `semantic_manifest.json`.
- This creates the file in your `/target` directory. If you're working from the Jaffle shop example, run `dbt seed && dbt run` before proceeding to ensure the data exists in your warehouse.
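+
+Once the semantic manifest exists, you can query metrics from the command line (a sketch; the metric and dimension names are assumptions based on the Jaffle shop example):
+
+```shell
+mf query --metrics order_total --group-by metric_time
+```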