From 573b65ec1d72b73fd4797bf233dc91298a5d0fac Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 6 Dec 2023 07:33:33 -0500 Subject: [PATCH 01/22] deprecated content --- website/docs/docs/build/metrics.md | 696 ------------------ website/docs/docs/build/projects.md | 2 +- .../avail-sl-integrations.md | 25 - .../docs/use-dbt-semantic-layer/dbt-sl.md | 99 --- .../use-dbt-semantic-layer/quickstart-sl.md | 263 ------- .../docs/use-dbt-semantic-layer/setup-sl.md | 5 - .../use-dbt-semantic-layer/sl-architecture.md | 32 - .../docs/use-dbt-semantic-layer/tableau.md | 1 - website/vercel.json | 5 + 9 files changed, 6 insertions(+), 1122 deletions(-) delete mode 100644 website/docs/docs/build/metrics.md diff --git a/website/docs/docs/build/metrics.md b/website/docs/docs/build/metrics.md deleted file mode 100644 index 7afcb41c2e4..00000000000 --- a/website/docs/docs/build/metrics.md +++ /dev/null @@ -1,696 +0,0 @@ ---- -title: "Metrics" -id: "metrics" -description: "When you define metrics in dbt projects, you encode crucial business logic in tested, version-controlled code. The dbt metrics layer helps you standardize metrics within your organization." -keywords: - - dbt metrics layer -tags: [Metrics] ---- - -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - - - - -The dbt Semantic Layer has undergone a [significant revamp](https://www.getdbt.com/blog/dbt-semantic-layer-whats-next/), improving governance, introducing new APIs, and making it more efficient to define/query metrics. This revamp means the dbt_metrics package and the legacy Semantic Layer, available in dbt v1.5 or lower, are no longer supported and won't receive any code fixes. - -**What’s changed?**

-The dbt_metrics package has been [deprecated](https://docs.getdbt.com/blog/deprecating-dbt-metrics) and replaced with [MetricFlow](/docs/build/about-metricflow?version=1.6), a new framework for defining metrics in dbt. This means dbt_metrics is no longer supported after dbt v1.5 and won't receive any code fixes. We will also remove the dbt_metrics spec and docs when it's fully deprecated.
-
-**Who does this affect?**

-Anyone who uses the dbt_metrics package or is integrated with the legacy Semantic Layer. The new Semantic Layer is available to [Team or Enterprise](https://www.getdbt.com/pricing/) multi-tenant dbt Cloud plans [hosted in North America](/docs/cloud/about-cloud/regions-ip-addresses). You must be on dbt v1.6 or higher to access it. All users can define metrics using MetricFlow. Users on dbt Cloud Developer plans or dbt Core can only use it to define and test metrics locally, but can't dynamically query them with integrated tools.
-
-**What should you do?**

-If you've defined metrics using dbt_metrics or integrated with the legacy Semantic Layer, we **highly** recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher to use MetricFlow or the new dbt Semantic Layer. To migrate to the new Semantic Layer, refer to the dedicated [migration guide](/guides/sl-migration) for more info.
-
-
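For readers planning that migration, the rough shape of the change is that a legacy `dbt_metrics` definition splits into a semantic model plus a metric in the MetricFlow spec. The following is a minimal, illustrative sketch only, assuming a hypothetical `orders` model with `order_id`, `order_date`, and `amount` columns; refer to the MetricFlow docs and the migration guide for the authoritative spec.

```yaml
# models/marts/orders_semantic.yml (hypothetical file name)
semantic_models:
  - name: orders
    model: ref('orders')            # the model that previously backed the legacy metric
    defaults:
      agg_time_dimension: order_date
    entities:
      - name: order_id              # assumed primary key
        type: primary
    dimensions:
      - name: order_date            # replaces the legacy `timestamp` property
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total           # replaces `calculation_method: sum` + `expression: amount`
        agg: sum
        expr: amount

metrics:
  - name: revenue
    label: Revenue
    type: simple
    type_params:
      measure: order_total
```

Compared to the legacy spec, aggregations, dimensions, and the time column move onto the semantic model, while the metric itself only references a measure.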
- - - - -A metric is an aggregation over a that supports zero or more dimensions. Some examples of metrics include: -- active users -- monthly recurring revenue (mrr) - -In v1.0, dbt supports metric definitions as a new node type. Like [exposures](exposures), metrics appear as nodes in the directed acyclic graph (DAG) and can be expressed in YAML files. Defining metrics in dbt projects encodes crucial business logic in tested, version-controlled code. Further, you can expose these metrics definitions to downstream tooling, which drives consistency and precision in metric reporting. - -### Benefits of defining metrics - -**Use metric specifications in downstream tools** -dbt's compilation context can access metrics via the [`graph.metrics` variable](graph). The [manifest artifact](manifest-json) includes metrics for downstream metadata consumption. - -**See and select dependencies** -As with Exposures, you can see everything that rolls up into a metric (`dbt ls -s +metric:*`), and visualize them in [dbt documentation](documentation). For more information, see "[The `metric:` selection method](node-selection/methods#the-metric-method)." - - - -## Defining a metric - -You can define metrics in `.yml` files nested under a `metrics:` key. Metric names must: -- contain only letters, numbers, and underscores (no spaces or special characters) -- begin with a letter -- contain no more than 250 characters - -For a short human-friendly name with title casing, spaces, and special characters, use the `label` property. - -### Example definition - - - - - -```yaml -# models/marts/product/schema.yml - -version: 2 - -models: - - name: dim_customers - ... - -metrics: - - name: rolling_new_customers - label: New Customers - model: ref('dim_customers') - [description](description): "The 14 day rolling count of paying customers using the product" - - calculation_method: count_distinct - expression: user_id - - timestamp: signup_date - time_grains: [day, week, month, quarter, year] - - dimensions: - - plan - - country - - window: - count: 14 - period: day - - filters: - - field: is_paying - operator: 'is' - value: 'true' - - field: lifetime_value - operator: '>=' - value: '100' - - field: company_name - operator: '!=' - value: "'Acme, Inc'" - - field: signup_date - operator: '>=' - value: "'2020-01-01'" - - # general properties - [config](resource-properties/config): - enabled: true | false - treat_null_values_as_zero: true | false - - [meta](resource-configs/meta): {team: Finance} -``` - - - - -```yaml -# models/marts/product/schema.yml - -version: 2 - -models: - - name: dim_customers - ... - -metrics: - - name: rolling_new_customers - label: New Customers - model: ref('dim_customers') - description: "The 14 day rolling count of paying customers using the product" - - type: count_distinct - sql: user_id - - timestamp: signup_date - time_grains: [day, week, month, quarter, year, all_time] - - dimensions: - - plan - - country - - filters: - - field: is_paying - operator: 'is' - value: 'true' - - field: lifetime_value - operator: '>=' - value: '100' - - field: company_name - operator: '!=' - value: "'Acme, Inc'" - - field: signup_date - operator: '>=' - value: "'2020-01-01'" - - meta: {team: Finance} -``` - - - - -:::caution - -- You cannot define metrics on [ephemeral models](https://docs.getdbt.com/docs/build/materializations#ephemeral). To define a metric, the materialization must have a representation in the data warehouse. 
- -::: - - -### Available properties -Metrics can have many declared **properties**, which define aspects of your metric. More information on [properties and configs can be found here](https://docs.getdbt.com/reference/configs-and-properties). - - - -| Field | Description | Example | Required? | -|-------------|-------------------------------------------------------------|---------------------------------|-----------| -| name | A unique identifier for the metric | new_customers | yes | -| model | The dbt model that powers this metric | dim_customers | yes (no for `derived` metrics)| -| label | A short for name / label for the metric | New Customers | yes | -| description | Long form, human-readable description for the metric | The number of customers who.... | no | -| calculation_method | The method of calculation (aggregation or derived) that is applied to the expression | count_distinct | yes | -| expression | The expression to aggregate/calculate over | user_id, cast(user_id as int) |yes | -| timestamp | The time-based component of the metric | signup_date | no yes | -| time_grains | One or more "grains" at which the metric can be evaluated. For more information, see the "Custom Calendar" section. | [day, week, month, quarter, year] | no yes | -| dimensions | A list of dimensions to group or filter the metric by | [plan, country] | no | -| window | A dictionary for aggregating over a window of time. Used for rolling metrics such as 14 day rolling average. Acceptable periods are: [`day`,`week`,`month`, `year`, `all_time`] | {count: 14, period: day} | no | -| filters | A list of filters to apply before calculating the metric | See below | no | -| config | [Optional configurations](https://github.com/dbt-labs/dbt_metrics#accepted-metric-configurations) for calculating this metric | {treat_null_values_as_zero: true} | no | -| meta | Arbitrary key/value store | {team: Finance} | no | - - - - - -| Field | Description | Example | Required? | -|-------------|-------------------------------------------------------------|---------------------------------|-----------| -| name | A unique identifier for the metric | new_customers | yes | -| model | The dbt model that powers this metric | dim_customers | yes (no for `derived` metrics)| -| label | A short for name / label for the metric | New Customers |yes | -| description | Long form, human-readable description for the metric | The number of customers who.... | no | -| type | The method of calculation (aggregation or derived) that is applied to the expression | count_distinct | yes | -| sql | The expression to aggregate/calculate over | user_id, cast(user_id as int) | yes | -| timestamp | The time-based component of the metric | signup_date | yes | -| time_grains | One or more "grains" at which the metric can be evaluated | [day, week, month, quarter, year, all_time] | yes | -| dimensions | A list of dimensions to group or filter the metric by | [plan, country] | no | -| filters | A list of filters to apply before calculating the metric | See below | no | -| meta | Arbitrary key/value store | {team: Finance} | no | - - - - -### Available calculation methods - - - -The method of calculation (aggregation or derived) that is applied to the expression. - - - - -The type of calculation (aggregation or expression) that is applied to the sql property. 
- - - -| Metric Calculation Method | Description | -|----------------|----------------------------------------------------------------------------| -| count | This metric type will apply the `count` aggregation to the specified field | -| count_distinct | This metric type will apply the `count` aggregation to the specified field, with an additional distinct statement inside the aggregation | -| sum | This metric type will apply the `sum` aggregation to the specified field | -| average | This metric type will apply the `average` aggregation to the specified field | -| min | This metric type will apply the `min` aggregation to the specified field | -| max | This metric type will apply the `max` aggregation to the specified field | -| median | This metric type will apply the `median` aggregation to the specified field, or an alternative `percentile_cont` aggregation if `median` is not available | -|derived expression | This metric type is defined as any _non-aggregating_ calculation of 1 or more metrics | - - - -### Derived Metrics -In v1.2, support was added for `derived` metrics (previously named `expression`), which are defined as non-aggregating calculations of 1 or more metrics. An example of this would be `{{metric('total_revenue')}} / {{metric('count_of_customers')}}`. - - By defining these metrics, you are able to create metrics like: -- ratios -- subtractions -- any arbitrary calculation - -As long as the two (or more) base metrics (metrics that comprise the `derived` metric) share the specified `time_grains` and `dimensions`, those attributes can be used in any downstream metrics macro. - -An example definition of an `derived` metric is: - - - - -```yaml -# models/marts/product/schema.yml -version: 2 - -models: - - name: dim_customers - ... - -metrics: - - name: average_revenue_per_customer - label: Average Revenue Per Customer - description: "The average revenue received per customer" - - calculation_method: derived - expression: "{{metric('total_revenue')}} / {{metric('count_of_customers')}}" - - timestamp: order_date - time_grains: [day, week, month, quarter, year, all_time] - dimensions: - - had_discount - - order_country - -``` - - - - - -### Expression Metrics -In v1.2, support was added for `expression` metrics, which are defined as non-aggregating calculations of 1 or more metrics. By defining these metrics, you are able to create metrics like: -- ratios -- subtractions -- any arbitrary calculation - -As long as the two+ base metrics (the metrics that comprise the `expression` metric) share the specified `time_grains` and `dimensions`, those attributes can be used in any downstream metrics macro. - -An example definition of an `expression` metric is: - - - - -```yaml -# models/marts/product/schema.yml -version: 2 - -models: - - name: dim_customers - ... - -metrics: - - name: average_revenue_per_customer - label: Average Revenue Per Customer - description: "The average revenue received per customer" - - type: expression - sql: "{{metric('total_revenue')}} / {{metric('count_of_customers')}}" - - timestamp: order_date - time_grains: [day, week, month, quarter, year, all_time] - dimensions: - - had_discount - - order_country - -``` - - -### Filters -Filters should be defined as a list of dictionaries that define predicates for the metric. Filters are combined using AND clauses. For more control, users can (and should) include the complex logic in the model powering the metric. - -All three properties (`field`, `operator`, `value`) are required for each defined filter. 
- -Note that `value` must be defined as a string in YAML, because it will be compiled into queries as part of a string. If your filter's value needs to be surrounded by quotes inside the query (e.g. text or dates), use `"'nested'"` quotes: - -```yml - filters: - - field: is_paying - operator: 'is' - value: 'true' - - field: lifetime_value - operator: '>=' - value: '100' - - field: company_name - operator: '!=' - value: "'Acme, Inc'" - - field: signup_date - operator: '>=' - value: "'2020-01-01'" -``` - -### Calendar -The dbt_metrics package contains a [basic calendar table](https://github.com/dbt-labs/dbt_metrics/blob/main/models/dbt_metrics_default_calendar.sql) that is created as part of your `dbt run`. It contains dates between 2010-01-01 and 2029-12-31. - -If you want to use a custom calendar, you can replace the default with any table which meets the following requirements: -- Contains a `date_day` column. -- Contains the following columns: `date_week`, `date_month`, `date_quarter`, `date_year`, or equivalents. -- Additional date columns need to be prefixed with `date_`, e.g. `date_4_5_4_month` for a 4-5-4 retail calendar date set. Dimensions can have any name (see following section). - -To do this, set the value of the `dbt_metrics_calendar_model` variable in your `dbt_project.yml` file: -```yaml -#dbt_project.yml -config-version: 2 -[...] -vars: - dbt_metrics_calendar_model: my_custom_calendar -``` - -#### Dimensions from calendar tables -You may want to aggregate metrics by a dimension in your custom calendar table, for example is_weekend. You can include this within the list of dimensions in the macro call without it needing to be defined in the metric definition. - -To do so, set a list variable at the project level called custom_calendar_dimension_list, as shown in the example below. - -```yaml -#dbt_project.yml -vars: - custom_calendar_dimension_list: ["is_weekend"] -``` - - - -### Configuration - -Metric nodes now accept `config` dictionaries like other dbt resources. Specify Metric configs in the metric yml itself, or for groups of metrics in the `dbt_project.yml` file. - - - - - - - - -```yml -version: 2 -metrics: - - name: config_metric - label: Example Metric with Config - model: ref(‘my_model’) - calculation_method: count - timestamp: date_field - time_grains: [day, week, month] - config: - enabled: true -``` - - - - - - - - -```yml -metrics: - your_project_name: - +enabled: true -``` - - - - - - - - -#### Accepted Metric Configurations - -The following is the list of currently accepted metric configs: - -| Config | Type | Accepted Values | Default Value | Description | -|--------|------|-----------------|---------------|-------------| -| `enabled` | boolean | True/False | True | Enables or disables a metric node. When disabled, dbt will not consider it as part of your project. | -| `treat_null_values_as_zero` | boolean | True/False | True | Controls the `coalesce` behavior for metrics. By default, when there are no observations for a metric, the output of the metric as well as [Period over Period](#secondary-calculations) secondary calculations will include a `coalesce({{ field }}, 0)` to return 0's rather than nulls. Setting this config to False instead returns `NULL` values. | - - - -## Querying Your Metric - -:::caution dbt_metrics is no longer supported -The dbt_metrics package has been deprecated and replaced with [MetricFlow](/docs/build/about-metricflow?version=1.6), a new way framework for defining metrics in dbt. 
This means dbt_metrics is no longer supported after dbt v1.5 and won't receive any code fixes. -::: - -You can dynamically query metrics directly in dbt and verify them before running a job in the deployment environment. To query your defined metric, you must have the [dbt_metrics package](https://github.com/dbt-labs/dbt_metrics) installed. Information on how to [install packages can be found here](https://docs.getdbt.com/docs/build/packages#how-do-i-add-a-package-to-my-project). - -Use the following [metrics package](https://hub.getdbt.com/dbt-labs/metrics/latest/) installation code in your packages.yml file and run `dbt deps` to install the metrics package: - - - -```yml -packages: - - package: dbt-labs/metrics - version: [">=1.3.0", "<1.4.0"] -``` - - - - - -```yml -packages: - - package: dbt-labs/metrics - version: [">=0.3.0", "<0.4.0"] -``` - - - -Once the package has been installed with `dbt deps`, make sure to run the `dbt_metrics_default_calendar` model as this is required for macros used to query metrics. More information on this, and additional calendar functionality, can be found in the [project README](https://github.com/dbt-labs/dbt_metrics#calendar). - -### Querying metrics with `metrics.calculate` -Use the `metrics.calculate` macro along with defined metrics to generate a SQL statement that runs the metric aggregation to return the correct metric dataset. Example below: - - - -```sql -select * -from {{ metrics.calculate( - metric('new_customers'), - grain='week', - dimensions=['plan', 'country'] -) }} -``` - - - -### Supported inputs -The example above doesn't display all the potential inputs you can provide to the macro. - -You may find some pieces of functionality, like secondary calculations, complicated to use. We recommend reviewing the [package README](https://github.com/dbt-labs/dbt_metrics) for more in-depth information about each of the inputs that are not covered in the table below. - - -| Input | Example | Description | Required | -| ----------- | ----------- | ----------- | -----------| -| metric_list | `metric('some_metric)'`,
[`metric('some_metric')`,
`metric('some_other_metric')`]
| The metric(s) to be queried by the macro. If multiple metrics required, provide in list format. | Required | -| grain | `'day'`, `'week'`,
`'month'`, `'quarter'`,
`'year'`
| The time grain that the metric will be aggregated to in the returned dataset | Optional | -| dimensions | [`'plan'`,
`'country'`] | The dimensions you want the metric to be aggregated by in the returned dataset | Optional | -| secondary_calculations | [`metrics.period_over_period( comparison_strategy="ratio", interval=1, alias="pop_1wk")`] | Performs the specified secondary calculation on the metric results. Examples include period over period calculations, rolling calculations, and period to date calculations. | Optional | -| start_date | `'2022-01-01'` | Limits the date range of data used in the metric calculation by not querying data before this date | Optional | -| end_date | `'2022-12-31'` | Limits the date range of data used in the metric calculation by not querying data after this date | Optional | -| where | `plan='paying_customer'` | A sql statement, or series of sql statements, that alter the **final** CTE in the generated sql. Most often used to limit the data to specific values of dimensions provided | Optional | - -### Secondary Calculations -Secondary calculations are window functions you can add to the metric calculation and perform on the primary metric or metrics. - -You can use them to compare values to an earlier period, calculate year-to-date sums, and return rolling averages. You can add custom secondary calculations into dbt projects - for more information on this, reference the [package README](https://github.com/dbt-labs/dbt_metrics#secondary-calculations). - -The supported Secondary Calculations are: - -#### Period over Period: - -The period over period secondary calculation performs a calculation against the metric(s) in question by either determining the difference or the ratio between two points in time. The input variable, which looks at the grain selected in the macro, determines the other point. - -| Input | Example | Description | Required | -| -------------------------- | ----------- | ----------- | -----------| -| `comparison_strategy` | `ratio` or `difference` | How to calculate the delta between the two periods | Yes | -| `interval` | 1 | Integer - the number of time grains to look back | Yes | -| `alias` | `week_over_week` | The column alias for the resulting calculation | No | -| `metric_list` | `base_sum_metric` | List of metrics that the secondary calculation should be applied to. Default is all metrics selected | No | - -#### Period to Date: - -The period to date secondary calculation performs an aggregation on a defined period of time that is equal to or higher than the grain selected. For example, when you want to display a month_to_date value alongside your weekly grained metric. - -| Input | Example | Description | Required | -| -------------------------- | ----------- | ----------- | -----------| -| `aggregate` | `max`, `average` | The aggregation to use in the window function. Options vary based on the primary aggregation and are enforced in [validate_aggregate_coherence()](https://github.com/dbt-labs/dbt_metrics/blob/main/macros/validation/validate_aggregate_coherence.sql). | Yes | -| `period` | `"day"`, `"week"` | The time grain to aggregate to. One of [`"day"`, `"week"`, `"month"`, `"quarter"`, `"year"`]. Must be at equal or coarser (higher, more aggregated) granularity than the metric's grain (see [Time Grains](#time-grains) below). In example grain of `month`, the acceptable periods would be `month`, `quarter`, or `year`. | Yes | -| `alias` | `month_to_date` | The column alias for the resulting calculation | No | -| `metric_list` | `base_sum_metric` | List of metrics that the secondary calculation should be applied to. 
Default is all metrics selected | No | - -#### Rolling: - - - -The rolling secondary calculation performs an aggregation on a number of rows in metric dataset. For example, if the user selects the `week` grain and sets a rolling secondary calculation to `4` then the value returned will be a rolling 4 week calculation of whatever aggregation type was selected. If the `interval` input is not provided then the rolling caclulation will be unbounded on all preceding rows. - -| Input | Example | Description | Required | -| -------------------------- | ----------- | ----------- | -----------| -| `aggregate` | `max`, `average` | The aggregation to use in the window function. Options vary based on the primary aggregation and are enforced in [validate_aggregate_coherence()](https://github.com/dbt-labs/dbt_metrics/blob/main/macros/validation/validate_aggregate_coherence.sql). | Yes | -| `interval` | 1 | Integer - the number of time grains to look back | No | -| `alias` | `month_to_date` | The column alias for the resulting calculation | No | -| `metric_list` | `base_sum_metric` | List of metrics that the secondary calculation should be applied to. Default is all metrics selected | No | - - - - -The rolling secondary calculation performs an aggregation on a number of rows in the metric dataset. For example, if the user selects the `week` grain and sets a rolling secondary calculation to `4`, then the value returned will be a rolling 4-week calculation of whatever aggregation type was selected. - -| Input | Example | Description | Required | -| -------------------------- | ----------- | ----------- | -----------| -| `aggregate` | `max`, `average` | The aggregation to use in the window function. Options vary based on the primary aggregation and are enforced in [validate_aggregate_coherence()](https://github.com/dbt-labs/dbt_metrics/blob/main/macros/validation/validate_aggregate_coherence.sql). | Yes | -| `interval` | 1 | Integer - the number of time grains to look back | Yes | -| `alias` | `month_to_date` | The column alias for the resulting calculation | No | -| `metric_list` | `base_sum_metric` | List of metrics that the secondary calculation should be applied to. Default is all metrics selected | No | - - - - -#### Prior: -The prior secondary calculation returns the value from a specified number of intervals before the row. - -| Input | Example | Description | Required | -| -------------------------- | ----------- | ----------- | -----------| -| `interval` | 1 | Integer - the number of time grains to look back | Yes | -| `alias` | `2_weeks_prior` | The column alias for the resulting calculation | No | -| `metric_list` | `base_sum_metric` | List of metrics that the secondary calculation should be applied to. Default is all metrics selected | No | - - - -### Developing metrics with `metrics.develop` - - - -There may be times you want to test what a metric might look like before defining it in your project. In these cases, use the `develop` metric, which allows you to provide metric(s) in a contained yml so you can simulate what a defined metric might look like in your project. 
- -```sql -{% set my_metric_yml -%} -{% raw %} - -metrics: - -- The name of the metric does not need to be develop_metric - - name: develop_metric - model: ref('fact_orders') - label: Total Discount ($) - timestamp: order_date - time_grains: [day, week, month, quarter, year, all_time] - calculation_method: average - expression: discount_total - dimensions: - - had_discount - - order_country - -{% endraw %} -{%- endset %} - -select * -from {{ metrics.develop( - develop_yml=my_metric_yml, - metric_list=['develop_metric'], - grain='month' - ) - }} -``` - -**Important caveat** - The metric list input for the `metrics.develop` macro takes in the metric names themselves, not the `metric('name')` statement that the `calculate` macro uses. Using the example above: - -- ✅ `['develop_metric']` -- ❌ `[metric('develop_metric')]` - - - - - -There may be times you want to test what a metric might look like before defining it in your project. In these cases, the `develop` metric, which allows you to provide a single metric in a contained yml so you can simulate what a defined metric might look like in your project. - - -```sql -{% set my_metric_yml -%} -{% raw %} - -metrics: - - name: develop_metric - model: ref('fact_orders') - label: Total Discount ($) - timestamp: order_date - time_grains: [day, week, month, quarter, year, all_time] - type: average - sql: discount_total - dimensions: - - had_discount - - order_country - -{% endraw %} -{%- endset %} - -select * -from {{ metrics.develop( - develop_yml=my_metric_yml, - grain='month' - ) - }} -``` - - - -#### Multiple/Derived Metrics with `metrics.develop` -If you have a more complicated use case that you are interested in testing, the develop macro also supports this behavior. The only caveat is that you must include the raw tags for any provided metric yml that contains a derived metric. Example below: - -``` -{% set my_metric_yml -%} -{% raw %} - -metrics: - - name: develop_metric - model: ref('fact_orders') - label: Total Discount ($) - timestamp: order_date - time_grains: [day, week, month] - calculation_method: average - expression: discount_total - dimensions: - - had_discount - - order_country - - - name: derived_metric - label: Total Discount ($) - timestamp: order_date - time_grains: [day, week, month] - calculation_method: derived - expression: "{{ metric('develop_metric') }} - 1 " - dimensions: - - had_discount - - order_country - - - name: some_other_metric_not_using - label: Total Discount ($) - timestamp: order_date - time_grains: [day, week, month] - calculation_method: derived - expression: "{{ metric('derived_metric') }} - 1 " - dimensions: - - had_discount - - order_country - -{% endraw %} -{%- endset %} - -select * -from {{ metrics.develop( - develop_yml=my_metric_yml, - metric_list=['derived_metric'] - grain='month' - ) - }} -``` - -The above example will return a dataset that contains the metric provided in the metric list (`derived_metric`) and the parent metric (`develop_metric`). It will not contain `some_other_metric_not_using` as it is not designated in the metric list or a parent of the metrics included. - -**Important caveat** - You _must_ wrap the `expression` property for `derived` metrics in double quotes to render it. For example, `expression: "{{ metric('develop_metric') }} - 1 "`. - -
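As a closing note for anyone migrating the `derived` examples above, the nearest equivalent in the MetricFlow spec is a `ratio` metric (or a `derived` metric for arbitrary expressions). This is a hedged sketch that assumes `total_revenue` and `count_of_customers` are already defined as MetricFlow metrics; it is not part of the legacy `dbt_metrics` spec documented on this page.

```yaml
metrics:
  - name: average_revenue_per_customer
    label: Average Revenue Per Customer
    description: "Assumes total_revenue and count_of_customers exist as MetricFlow metrics."
    type: ratio
    type_params:
      numerator: total_revenue
      denominator: count_of_customers
```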
- - - - - diff --git a/website/docs/docs/build/projects.md b/website/docs/docs/build/projects.md index a54f6042cce..829245de6fc 100644 --- a/website/docs/docs/build/projects.md +++ b/website/docs/docs/build/projects.md @@ -19,7 +19,7 @@ At a minimum, all a project needs is the `dbt_project.yml` project configuration | [docs](/docs/collaborate/documentation) | Docs for your project that you can build. | | [sources](/docs/build/sources) | A way to name and describe the data loaded into your warehouse by your Extract and Load tools. | | [exposures](/docs/build/exposures) | A way to define and describe a downstream use of your project. | -| [metrics](/docs/build/metrics) | A way for you to define metrics for your project. | +| [metrics](/docs/build/build-metrics-intro) | A way for you to define metrics for your project. | | [groups](/docs/build/groups) | Groups enable collaborative node organization in restricted collections. | | [analysis](/docs/build/analyses) | A way to organize analytical SQL queries in your project such as the general ledger from your QuickBooks. | diff --git a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md index be02fedb230..045838602a9 100644 --- a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md +++ b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md @@ -9,9 +9,6 @@ meta: api_name: dbt Semantic Layer APIs --- - - - There are a number of data applications that seamlessly integrate with the dbt Semantic Layer, powered by MetricFlow, from business intelligence tools to notebooks, spreadsheets, data catalogs, and more. These integrations allow you to query and unlock valuable insights from your data ecosystem. Use the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) to simplify metric queries, optimize your development workflow, and reduce coding. This approach also ensures data governance and consistency for data consumers. @@ -34,25 +31,3 @@ import AvailIntegrations from '/snippets/_sl-partner-links.md'; - [dbt Semantic Layer API query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) - [Hex dbt Semantic Layer cells](https://learn.hex.tech/docs/logic-cell-types/transform-cells/dbt-metrics-cells) to set up SQL cells in Hex. - [Resolve 'Failed APN'](/faqs/Troubleshooting/sl-alpn-error) error when connecting to the dbt Semantic Layer. - - - - - -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - -A wide variety of data applications across the modern data stack natively integrate with the dbt Semantic Layer and dbt metrics — from Business Intelligence tools to notebooks, data catalogs, and more. - -The dbt Semantic Layer integrations are capable of querying dbt metrics, importing definitions, surfacing the underlying data in partner tools, and leveraging the dbt Server. - -For information on the partner integrations, their documentation, and more — refer to the [dbt Semantic Layer integrations](https://www.getdbt.com/product/semantic-layer-integrations) page. 
- - - -## Related docs - -- [About the dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl) - - diff --git a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md index 8387e934d84..cde9ae4afbb 100644 --- a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md @@ -9,9 +9,6 @@ pagination_next: "docs/use-dbt-semantic-layer/quickstart-sl" pagination_prev: null --- - - - The dbt Semantic Layer, powered by [MetricFlow](/docs/build/about-metricflow), simplifies the process of defining and using critical business metrics, like `revenue` in the modeling layer (your dbt project). By centralizing metric definitions, data teams can ensure consistent self-service access to these metrics in downstream data tools and applications. The dbt Semantic Layer eliminates duplicate coding by allowing data teams to define metrics on top of existing models and automatically handles data joins. @@ -62,99 +59,3 @@ plan="dbt Cloud Team or Enterprise" icon="dbt-bit"/> - - - - - -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - -The dbt Semantic Layer allows your data teams to centrally define essential business metrics like `revenue`, `customer`, and `churn` in the modeling layer (your dbt project) for consistent self-service within downstream data tools like BI and metadata management solutions. The dbt Semantic Layer provides the flexibility to define metrics on top of your existing models and then query those metrics and models in your analysis tools of choice. - -Resulting in less duplicate coding for data teams and more consistency for data consumers. - -The dbt Semantic Layer has these main parts: - -- Define your metrics in version-controlled dbt project code using [MetricFlow](/docs/build/about-metricflow) - * dbt_metrics is now deprecated -- Import your metric definitions using the [Discovery API](/docs/dbt-cloud-apis/discovery-api) -- Query your metric data with the dbt Proxy Server -- Explore and analyze dbt metrics in downstream tools - -### What makes the dbt Semantic Layer different? - -The dbt Semantic Layer reduces code duplication and inconsistency regarding your business metrics. By moving metric definitions out of the BI layer and into the modeling layer, your data teams can feel confident that different business units are working from the same metric definitions, regardless of their tool of choice. If a metric definition changes in dbt, it’s refreshed everywhere it’s invoked and creates consistency across all applications. You can also use the dbt Semantic Layer to query models and use macros. - - -## Prerequisites - - - - - - -## Manage metrics - -:::info 📌 - -New to dbt or metrics? Check out our [quickstart guide](/guides) to build your first dbt project! If you'd like to define your first metrics, try our [Jaffle Shop](https://github.com/dbt-labs/jaffle_shop_metrics) example project. - -::: - -If you're not sure whether to define a metric in dbt or not, ask yourself the following: - -> *Is this something our teams consistently need to report on?* - -An important business metric should be: - -- Well-defined (the definition is agreed upon throughout the entire organization) -- Time-bound (able to be compared across time) - -A great example of this is **revenue**. It can be aggregated on multiple levels (weekly, monthly, and so on) and is key for the broader business to understand. 
- -- ✅ `Monthly recurring revenue` or `Weekly active users` or `Average order value` -- ❌ `1-off experimental metric` - - -### Design and define metrics - -You can design and define your metrics in `.yml` files nested under a metrics key in your dbt project. For more information, refer to these docs:
- -- [dbt metrics](docs/build/metrics) for in-depth detail on attributes, filters, how to define and query your metrics, and [dbt-metrics package](https://github.com/dbt-labs/dbt_metrics) -- [dbt Semantic Layer quickstart](/docs/use-dbt-semantic-layer/quickstart-semantic-layer) to get started - -## Related questions - -
-**How do I migrate from the legacy Semantic Layer to the new one?**
-
-If you're using the legacy Semantic Layer, we highly recommend you upgrade your dbt version to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated migration guide for more info.
-
-**How are you storing my data?**
-
-The dbt Semantic Layer doesn't store, cache, or log your data. On each query to the Semantic Layer, the resulting data passes through dbt Cloud servers where it's never stored, cached, or logged. The data from your data platform gets routed through dbt Cloud servers to your connecting data tool.
-
-**Is the dbt Semantic Layer open source?**
-
-Some components of the dbt Semantic Layer are open source like dbt-core, the dbt_metrics package, and the BSL-licensed dbt-server. The dbt Proxy Server (what is actually compiling the dbt code) and the Discovery API are not open source.
-
-During Public Preview, the dbt Semantic Layer is open to all dbt Cloud tiers — Developer, Team, and Enterprise.
-
-**Is there a dbt Semantic Layer discussion hub?**
-
-Yes, absolutely! Join the dbt Slack community and #dbt-cloud-semantic-layer slack channel for all things related to the dbt Semantic Layer.
-
diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 62437f4ecd6..19fcc4b9eda 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -8,8 +8,6 @@ meta: api_name: dbt Semantic Layer APIs --- - - import CreateModel from '/snippets/_sl-create-semanticmodel.md'; import DefineMetrics from '/snippets/_sl-define-metrics.md'; @@ -98,264 +96,3 @@ import SlFaqs from '/snippets/_sl-faqs.md'; - [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) - Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) - [Billing](/docs/cloud/billing) - - - - - -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - -To try out the features of the dbt Semantic Layer, you first need to have a dbt project set up. This quickstart guide will lay out the following steps, and recommends a workflow that demonstrates some of its essential features: - -- Install dbt metrics package - * Note: this package will be deprecated very soon and we highly recommend you to use the new [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl?version=1.6), available in dbt v 1.6 or higher. -- Define metrics -- Query, and run metrics -- Configure the dbt Semantic Layer - -## Prerequisites - -To use the dbt Semantic Layer, you’ll need to meet the following: - - - - - - -:::info 📌 - -New to dbt or metrics? Check out our [quickstart guide](/guides) to build your first dbt project! If you'd like to define your first metrics, try our [Jaffle Shop](https://github.com/dbt-labs/jaffle_shop_metrics) example project. - -::: - -## Installing dbt metrics package - -The dbt Semantic Layer supports the calculation of metrics by using the [dbt metrics package](https://hub.getdbt.com/dbt-labs/metrics/latest/). You can install the dbt metrics package in your dbt project by copying the below code blocks. - - - -```yml -packages: - - package: dbt-labs/metrics - version: [">=1.3.0", "<1.4.0"] -``` - - - - - -```yml -packages: - - package: dbt-labs/metrics - version: [">=0.3.0", "<0.4.0"] -``` - - - - -1. Paste the dbt metrics package code in your `packages.yml` file. -2. Run the [`dbt deps` command](/reference/commands/deps) to install the package. -3. If you see a successful result, you have now installed the dbt metrics package successfully! -4. If you have any errors during the `dbt deps` command run, review the system logs for more information on how to resolve them. Make sure you use a dbt metrics package that’s compatible with your dbt environment version. - - - -## Design and define metrics - -Review our helpful metrics video below, which explains what metrics are, why they're important and how you can get started: - - - -Now that you've organized your metrics folder and files, you can define your metrics in `.yml` files nested under a `metrics` key. - -1. Add the metric definitions found in the [Jaffle Shop](https://github.com/dbt-labs/jaffle_shop_metrics) example to your dbt project. 
For example, to add an expenses metric, reference the following metrics you can define directly in your metrics folder: - - - -```sql -version: 2 - -metrics: - - name: expenses - label: Expenses - model: ref('orders') - description: "The total expenses of our jaffle business" - - calculation_method: sum - expression: amount / 4 - - timestamp: order_date - time_grains: [day, week, month, year] - - dimensions: - - customer_status - - had_credit_card_payment - - had_coupon_payment - - had_bank_transfer_payment - - had_gift_card_payment - - filters: - - field: status - operator: '=' - value: "'completed'" -``` - - - - -```sql -version: 2 - -metrics: - - name: expenses - label: Expenses - model: ref('orders') - description: "The total expenses of our jaffle business" - - type: sum - sql: amount / 4 - - timestamp: order_date - time_grains: [day, week, month, year] - - dimensions: - - customer_status - - had_credit_card_payment - - had_coupon_payment - - had_bank_transfer_payment - - had_gift_card_payment - - filters: - - field: status - operator: '=' - value: "'completed'" -``` - - -1. Click **Save** and then **Compile** the code. -2. Commit and merge the code changes that contain the metric definitions. -3. If you'd like to further design and define your own metrics, review the following documentation: - - - [dbt metrics](/docs/build/metrics) will provide you in-depth detail on attributes, properties, filters, and how to define and query metrics. - -## Develop and query metrics - -You can dynamically develop and query metrics directly in dbt and verify their accuracy _before_ running a job in the deployment environment by using the `metrics.calculate` and `metrics.develop` macros. - -To understand when and how to use the macros above, review [dbt metrics](/docs/build/metrics) and make sure you install the [dbt_metrics package](https://github.com/dbt-labs/dbt_metrics) first before using the above macros. - -:::info 📌 - -**Note:** You will need access to dbt Cloud and the dbt Semantic Layer from your integrated partner tool of choice. - -::: - -## Run your production job - -Once you’ve defined metrics in your dbt project, you can perform a job run in your deployment environment to materialize your metrics. The deployment environment is only supported for the dbt Semantic Layer at this moment. - -1. Go to **Deploy** in the navigation and select **Jobs** to re-run the job with the most recent code in the deployment environment. -2. Your metric should appear as a red node in the dbt Cloud IDE and dbt directed acyclic graphs (DAG). - - - - -**What’s happening internally?** - -- Merging the code into your main branch allows dbt Cloud to pull those changes and builds the definition in the manifest produced by the run. -- Re-running the job in the deployment environment helps materialize the models, which the metrics depend on, in the data platform. It also makes sure that the manifest is up to date. -- Your dbt Discovery API pulls in the most recent manifest and allows your integration information to extract metadata from it. - -## Set up dbt Semantic Layer - - - - -## Troubleshooting - -If you're encountering some issues when defining your metrics or setting up the dbt Semantic Layer, check out a list of answers to some of the questions or problems you may be experiencing. - -
-**How are you storing my data?**
-
-The dbt Semantic Layer does not store, cache, or log your data. On each query to the Semantic Layer, the resulting data passes through dbt Cloud servers where it is never stored, cached, or logged. The data from your data platform gets routed through dbt Cloud servers to your connecting data tool.
-
-**Is the dbt Semantic Layer open source?**
-
-Some components of the dbt Semantic Layer are open source like dbt-core, the dbt_metrics package, and the BSL-licensed dbt-server. The dbt Proxy Server (what is actually compiling the dbt code) and the Discovery API are not open source.
-
-During Public Preview, the dbt Semantic Layer is open to all dbt Cloud tiers (Developer, Team, and Enterprise).
-
-  - dbt Core users can define metrics in their dbt Core projects and calculate them using macros from the metrics package. To use the dbt Semantic Layer integrations, you will need to have a dbt Cloud account.
-  - Developer accounts will be able to query the Proxy Server using SQL, but will not be able to browse pre-populated dbt metrics in external tools, which requires access to the Discovery API.
-  - Team and Enterprise accounts will be able to set up the Semantic Layer and Discovery API in the integrated partner tool to import metric definitions.
-
-**The dbt_metrics_calendar_table does not exist or is not authorized?**
-
-All metrics queries are dependent on either the dbt_metrics_calendar_table or a custom calendar set in the user's `dbt_project.yml`. If you have not created this model in the database, these queries will fail and you'll most likely see the following error message:
-
-`Object DATABASE.SCHEMA.DBT_METRICS_DEFAULT_CALENDAR does not exist or not authorized.`
-
-Fix:
-
-  - If developing locally, run `dbt run --select dbt_metrics_default_calendar`.
-  - If you are using this in production, make sure that you perform a full `dbt build` or `dbt run`. If you are running specific selects in your production job, then you will not create this required model.
-
-**Ephemeral Models - Object does not exist or is not authorized**
-
-Metrics cannot be defined on ephemeral models because we reference the underlying table in the query that generates the metric, so we need the table/view to exist in the database. If your table/view does not exist in your database, you might see this error message:
-
-`Object 'DATABASE.SCHEMA.METRIC_MODEL_TABLE' does not exist or not authorized.`
-
-Fix:
-
-  - You will need to materialize the model that the metric is built on as a table/view/incremental.
-
-**Mismatched Versions - metric type is ‘’**
-
-If you’re running dbt_metrics ≥v0.3.2 but have dbt-core version ≥1.3.0, you’ll likely see these error messages:
-
-  - Error message 1: `The metric NAME also references ... but its type is ''. Only metrics of type expression can reference other metrics.`
-  - Error message 2: `Unknown aggregation style: > in macro default__gen_primary_metric_aggregate (macros/sql_gen/gen_primary_metric_aggregate.sql)`
-
-The reason you're experiencing this error is because we changed the `type` property of the metric spec in dbt-core v1.3.0. The new name is `calculation_method` and the package reflects that new name, so it isn't finding any `type` when we try and run outdated code on it.
-
-Fix:
-
-  - Upgrade the dbt_metrics package to a version that's compatible with your dbt-core version (see the package installation section above).
-
- - -## Next steps - -Are you ready to define your own metrics and bring consistency to data consumers? Review the following documents to understand how to structure, define, and query metrics, and set up the dbt Semantic Layer: - -- [dbt metrics](/docs/build/metrics) for in-depth detail on attributes, properties, filters, and how to define and query metrics -- [dbt Server repo](https://github.com/dbt-labs/dbt-server), which is a persisted HTTP server that wraps dbt core to handle RESTful API requests for dbt operations. - -
diff --git a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md index 33f1f43f614..fb7b853ad24 100644 --- a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md @@ -6,9 +6,6 @@ sidebar_label: "Set up your Semantic Layer" tags: [Semantic Layer] --- - - - With the dbt Semantic Layer, you can centrally define business metrics, reduce code duplication and inconsistency, create self-service in downstream tools, and more. Configure the dbt Semantic Layer in dbt Cloud to connect with your integrated partner tool. ## Prerequisites @@ -88,8 +85,6 @@ It is _not_ recommended that you use your dbt Cloud credentials due to elevated
-
- ## Related docs - [Build your metrics](/docs/build/build-metrics-intro) diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md index 9aea2ab42b0..f1fd13944e9 100644 --- a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md +++ b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md @@ -7,9 +7,6 @@ tags: [Semantic Layer] pagination_next: null --- - - - The dbt Semantic Layer allows you to define metrics and use various interfaces to query them. The Semantic Layer does the heavy lifting to find where the queried data exists in your data platform and generates the SQL to make the request (including performing joins). @@ -46,32 +43,3 @@ The following table compares the features available in dbt Cloud and source avai import SlFaqs from '/snippets/_sl-faqs.md'; - - - - - -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - -## Product architecture - -The dbt Semantic Layer product architecture includes four primary components: - -| Components | Information | Developer plans | Team plans | Enterprise plans | License | -| --- | --- | :---: | :---: | :---: | --- | -| **[dbt project](/docs/build/metrics)** | Define models and metrics in dbt Core.
*Note, we will deprecate and no longer support the dbt_metrics package. | ✅ | ✅ | ✅ | Open source, Core | -| **[dbt Server](https://github.com/dbt-labs/dbt-server)**| A persisted HTTP server that wraps dbt core to handle RESTful API requests for dbt operations. | ✅ | ✅ | ✅ | BSL | -| **SQL Proxy** | Reverse-proxy that accepts dbt-SQL (SQL + Jinja like query models and metrics, use macros), compiles the query into pure SQL, and executes the query against the data platform. | ✅

_* Available during Public Preview only_ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise) | -| **[Discovery API](/docs/dbt-cloud-apis/discovery-api)** | Accesses metric definitions primarily via integrations and is the source of truth for objects defined in dbt projects (like models, macros, sources, metrics). The Discovery API is updated at the end of every dbt Cloud run. | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise) | - - - -dbt Semantic Layer integrations will: - -- Leverage the Discovery API to fetch a list of objects and their attributes, like metrics -- Generate a dbt-SQL statement -- Then query the SQL proxy to evaluate the results of this statement - -
diff --git a/website/docs/docs/use-dbt-semantic-layer/tableau.md b/website/docs/docs/use-dbt-semantic-layer/tableau.md index 0f12a75f468..689df12ec6a 100644 --- a/website/docs/docs/use-dbt-semantic-layer/tableau.md +++ b/website/docs/docs/use-dbt-semantic-layer/tableau.md @@ -9,7 +9,6 @@ sidebar_label: "Tableau (beta)" The Tableau integration with the dbt Semantic Layer is a [beta feature](/docs/dbt-versions/product-lifecycles#dbt-cloud). ::: - The Tableau integration allows you to use worksheets to query the Semantic Layer directly and produce your dashboards with trusted data. This integration provides a live connection to the dbt Semantic Layer through Tableau Desktop or Tableau Server. diff --git a/website/vercel.json b/website/vercel.json index 3377b49278d..2329cac0941 100644 --- a/website/vercel.json +++ b/website/vercel.json @@ -2,6 +2,11 @@ "cleanUrls": true, "trailingSlash": false, "redirects": [ + { + "source": "/docs/build/metrics", + "destination": "/docs/build/build-metrics-intro", + "permanent": true + }, { "source": "/docs/cloud/dbt-cloud-ide", "destination": "/docs/cloud/dbt-cloud-ide/develop-in-the-cloud", From 8b00253997ff14a9b2f6777da8f13d374493b423 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Wed, 6 Dec 2023 12:57:02 -0500 Subject: [PATCH 02/22] updat elink --- website/docs/reference/node-selection/methods.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/website/docs/reference/node-selection/methods.md b/website/docs/reference/node-selection/methods.md index e29612e3401..1b3c5be980b 100644 --- a/website/docs/reference/node-selection/methods.md +++ b/website/docs/reference/node-selection/methods.md @@ -244,7 +244,7 @@ dbt ls --select "+exposure:*" --resource-type source # list all sources upstr ### The "metric" method -The `metric` method is used to select parent resources of a specified [metric](/docs/build/metrics). Use in conjunction with the `+` operator. +The `metric` method is used to select parent resources of a specified [metric](/docs/build/build-metrics-intro). Use in conjunction with the `+` operator. 
```bash dbt build --select "+metric:weekly_active_users" # build all resources upstream of weekly_active_users metric @@ -367,4 +367,4 @@ dbt list --select semantic_model:* # list all semantic models dbt list --select +semantic_model:orders # list your semantic model named "orders" and all upstream resources ``` - \ No newline at end of file + From a2809137a9bb81f66a7bd4f2d4fa90b8b7ad9d86 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Thu, 7 Dec 2023 09:49:50 -0500 Subject: [PATCH 03/22] removing metric and metrics schema --- .../schema-discovery-job-metric.mdx | 58 ------------------ .../schema-discovery-job-metrics.mdx | 60 ------------------- website/sidebars.js | 2 - website/vercel.json | 10 ++++ 4 files changed, 10 insertions(+), 120 deletions(-) delete mode 100644 website/docs/docs/dbt-cloud-apis/schema-discovery-job-metric.mdx delete mode 100644 website/docs/docs/dbt-cloud-apis/schema-discovery-job-metrics.mdx diff --git a/website/docs/docs/dbt-cloud-apis/schema-discovery-job-metric.mdx b/website/docs/docs/dbt-cloud-apis/schema-discovery-job-metric.mdx deleted file mode 100644 index 3a8a52a19cb..00000000000 --- a/website/docs/docs/dbt-cloud-apis/schema-discovery-job-metric.mdx +++ /dev/null @@ -1,58 +0,0 @@ ---- -title: "Metric object schema" -sidebar_label: "Metric" -id: "discovery-schema-job-metric" ---- - -import { NodeArgsTable, SchemaTable } from "./schema"; - -The metric object allows you to query information about [metrics](/docs/build/metrics). - -### Arguments - -When querying for a `metric`, the following arguments are available. - - - -Below we show some illustrative example queries and outline the schema (all possible fields you can query) of the metric object. - -### Example query - -The example query below outputs information about a metric. You can also add any field from the model endpoint (the example simply selects name). This includes schema, database, uniqueId, columns, and more. For details, refer to [Model object schema](/docs/dbt-cloud-apis/discovery-schema-job-model). - - -```graphql -{ - job(id: 123) { - metric(uniqueId: "metric.jaffle_shop.new_customers") { - uniqueId - name - packageName - tags - label - runId - description - type - sql - timestamp - timeGrains - dimensions - meta - resourceType - filters { - field - operator - value - } - model { - name - } - } - } -} -``` - -### Fields -When querying for a `metric`, the following fields are available: - - diff --git a/website/docs/docs/dbt-cloud-apis/schema-discovery-job-metrics.mdx b/website/docs/docs/dbt-cloud-apis/schema-discovery-job-metrics.mdx deleted file mode 100644 index 174dd5b676a..00000000000 --- a/website/docs/docs/dbt-cloud-apis/schema-discovery-job-metrics.mdx +++ /dev/null @@ -1,60 +0,0 @@ ---- -title: "Metrics object schema" -sidebar_label: "Metrics" -id: "discovery-schema-job-metrics" ---- - -import { NodeArgsTable, SchemaTable } from "./schema"; - -The metrics object allows you to query information about [metrics](/docs/build/metrics). - - -### Arguments - -When querying for `metrics`, the following arguments are available. - - - -Below we show some illustrative example queries and outline the schema (all possible fields you can query) of the metrics object. - -### Example query - -The example query returns information about all metrics for the given job. 
- -```graphql -{ - job(id: 123) { - metrics { - uniqueId - name - packageName - tags - label - runId - description - type - sql - timestamp - timeGrains - dimensions - meta - resourceType - filters { - field - operator - value - } - model { - name - } - } - } -} -``` - -### Fields -The metrics object can access the _same fields_ as the [metric node](/docs/dbt-cloud-apis/discovery-schema-job-metric). The difference is that the metrics object can output a list so instead of querying for fields for one specific metric, you can query for those parameters for all metrics in a run. - -When querying for `metrics`, the following fields are available: - - diff --git a/website/sidebars.js b/website/sidebars.js index 598fffc7f0d..792645b53e7 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -556,8 +556,6 @@ const sidebarSettings = { "docs/dbt-cloud-apis/discovery-schema-job", "docs/dbt-cloud-apis/discovery-schema-job-model", "docs/dbt-cloud-apis/discovery-schema-job-models", - "docs/dbt-cloud-apis/discovery-schema-job-metric", - "docs/dbt-cloud-apis/discovery-schema-job-metrics", "docs/dbt-cloud-apis/discovery-schema-job-source", "docs/dbt-cloud-apis/discovery-schema-job-sources", "docs/dbt-cloud-apis/discovery-schema-job-seed", diff --git a/website/vercel.json b/website/vercel.json index 2329cac0941..eb057123cde 100644 --- a/website/vercel.json +++ b/website/vercel.json @@ -2,6 +2,16 @@ "cleanUrls": true, "trailingSlash": false, "redirects": [ + { + "source": "/docs/dbt-cloud-apis/discovery-schema-job-metric", + "destination": "/docs/dbt-cloud-apis/discovery-schema-environment", + "permanent": true + }, + { + "source": "/docs/dbt-cloud-apis/discovery-schema-job-metrics", + "destination": "/docs/dbt-cloud-apis/discovery-schema-environment", + "permanent": true + }, { "source": "/docs/build/metrics", "destination": "/docs/build/build-metrics-intro", From d569754aa118ac1260cdd80a3e9745c323ac6329 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 11:37:45 -0500 Subject: [PATCH 04/22] update sidebar --- website/sidebars.js | 1 - 1 file changed, 1 deletion(-) diff --git a/website/sidebars.js b/website/sidebars.js index 792645b53e7..544b3a1e92a 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -281,7 +281,6 @@ const sidebarSettings = { "docs/build/jinja-macros", "docs/build/sources", "docs/build/exposures", - "docs/build/metrics", "docs/build/groups", "docs/build/analyses", ], From 8ee808d9145107c71abe59e8e2016ab45a4772d9 Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Mon, 11 Dec 2023 11:59:10 -0500 Subject: [PATCH 05/22] Create legacy-sl.md --- .../release-notes/74-Dec-2023/legacy-sl.md | 13 +++++++++++++ 1 file changed, 13 insertions(+) create mode 100644 website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md new file mode 100644 index 00000000000..00b1dd4b74c --- /dev/null +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -0,0 +1,13 @@ +--- +title: "Deprecation: dbt Metrics and the Legacy dbt Semantic Layer for Git repository caching" +description: "December 2023: dbt Cloud can cache your project's code (as well as other dbt packages) to ensure runs can begin despite an upstream Git provider's outage." 
+sidebar_label: "Deprecation: Support for Git repository caching" +sidebar_position: 10 +date: 2023-12-15 +--- + +Now available for dbt Cloud Enterprise plans is a new option to enable Git repository caching for your job runs. When enabled, dbt Cloud caches your dbt project's Git repository and uses the cached copy instead if there's an outage with the Git provider. This feature improves the reliability and stability of your job runs. + +To learn more, refer to [Repo caching](/docs/deploy/deploy-environments#git-repository-caching). + + From 9edcc6d61566ca4c9da4c615f15ac92a0af7dd1f Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 12:24:46 -0500 Subject: [PATCH 06/22] update rn --- .../release-notes/74-Dec-2023/legacy-sl.md | 12 +++++------- 1 file changed, 5 insertions(+), 7 deletions(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index 00b1dd4b74c..fbe5396d9a6 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -1,13 +1,11 @@ --- -title: "Deprecation: dbt Metrics and the Legacy dbt Semantic Layer for Git repository caching" -description: "December 2023: dbt Cloud can cache your project's code (as well as other dbt packages) to ensure runs can begin despite an upstream Git provider's outage." -sidebar_label: "Deprecation: Support for Git repository caching" +title: "Deprecation: dbt Metrics and the legacy dbt Semantic Layer is now deprecated" +description: "December 2023: For users on dbt v1.5 and lower, dbt Metrics and the legacy dbt Semantic Layer has been deprecated. Use the migration guide to migrate to and access the latest dbt Semantic Layer. " +sidebar_label: "Deprecation: dbt Metrics and Legacy dbt Semantic Layer" sidebar_position: 10 date: 2023-12-15 --- -Now available for dbt Cloud Enterprise plans is a new option to enable Git repository caching for your job runs. When enabled, dbt Cloud caches your dbt project's Git repository and uses the cached copy instead if there's an outage with the Git provider. This feature improves the reliability and stability of your job runs. +On December 15th, 2023, dbt Labs deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the product from the dbt Cloud user interface and [documentation site](https://docs.getdbt.com/). -To learn more, refer to [Repo caching](/docs/deploy/deploy-environments#git-repository-caching). - - +To access the [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), switch to the latest version using the dedicated [migration guide](/guides/sl-migration?step=1). 
From f9c7d9bd40d686fc7029321a9ca8b2c77f677430 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 12:58:19 -0500 Subject: [PATCH 07/22] tweaks of rn --- .../release-notes/74-Dec-2023/legacy-sl.md | 25 +++++++-- .../docs/use-dbt-semantic-layer/setup-sl.md | 51 ------------------- website/snippets/sl-prerequisites.md | 38 -------------- website/snippets/sl-public-preview-banner.md | 7 --- website/snippets/sl-set-up-steps.md | 30 ----------- 5 files changed, 22 insertions(+), 129 deletions(-) delete mode 100644 website/snippets/sl-prerequisites.md delete mode 100644 website/snippets/sl-public-preview-banner.md delete mode 100644 website/snippets/sl-set-up-steps.md diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index fbe5396d9a6..395ad258b62 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -2,10 +2,29 @@ title: "Deprecation: dbt Metrics and the legacy dbt Semantic Layer is now deprecated" description: "December 2023: For users on dbt v1.5 and lower, dbt Metrics and the legacy dbt Semantic Layer has been deprecated. Use the migration guide to migrate to and access the latest dbt Semantic Layer. " sidebar_label: "Deprecation: dbt Metrics and Legacy dbt Semantic Layer" -sidebar_position: 10 +sidebar_position: 09 date: 2023-12-15 --- -On December 15th, 2023, dbt Labs deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the product from the dbt Cloud user interface and [documentation site](https://docs.getdbt.com/). +dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes will be in effect on _December 15th, 2023_. -To access the [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), switch to the latest version using the dedicated [migration guide](/guides/sl-migration?step=1). +This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the product from the dbt Cloud user interface and [documentation site](https://docs.getdbt.com/). + +### Why this change? + +We understand that changes of this nature can be disruptive and we believe in the potential of the new direction. The [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), powered by MetricFlow, is a significantly improved foundation that enables more flexible query generation, faster performance, and a more dynamic consumption experience. It’s a step towards a brighter future for dbt and its community. + +### Key changes + +- Legacy Semantic Layer and dbt Metrics will officially be deprecated on December 15th, 2023. +- MetricFlow has replaced dbt Metrics for the construction of semantic logic. The `dbt_metrics` package will no longer be supported. +- Exports replaces Materializations. Exports will be available in December or January in dbt Cloud and replaces the previous materializations functionality. + +### Actions + +- If you're using the legacy dbt Semantic Layer or dbt Metrics, use the [dbt Semantic Layer migration guide](/guides/sl-migration?step=1) to migrate over to the re-released dbt Semantic Layer. 
+- Engage and share feedback with the dbt Labs team and dbt Community slack using channels like [#dbt-cloud-semantic-layer](https://getdbt.slack.com/archives/C046L0VTVR6) and [#dbt-metricflow](https://getdbt.slack.com/archives/C02CCBBBR1D). Or reach out to your dbt Cloud account representative. +- Refer to some additional info and resources to help you upgrade your dbt version: + + - [Upgrade version in dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) + - [Version migration guides](/docs/dbt-versions/core-upgrade) diff --git a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md index fb7b853ad24..0944e5ce1e1 100644 --- a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md @@ -32,58 +32,7 @@ import SlSetUp from '/snippets/_new-sl-setup.md'; 8. You’re done 🎉! The semantic layer should is now enabled for your project. --> - - - -import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; - - - -With the dbt Semantic Layer, you can define business metrics, reduce code duplication and inconsistency, create self-service in downstream tools, and more. Configure the dbt Semantic Layer in dbt Cloud to connect with your integrated partner tool. - -## Prerequisites - - - - -## Set up dbt Semantic Layer - -:::tip -If you're using the legacy Semantic Layer, dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for more info. - -::: - - * Team and Enterprise accounts can set up the Semantic Layer and [Discovery API](/docs/dbt-cloud-apis/discovery-api) in the integrated partner tool to import metric definitions. - * Developer accounts can query the Proxy Server using SQL but won't be able to browse dbt metrics in external tools, which requires access to the Discovery API. - - -1. Log in to your dbt Cloud account. -2. Go to **Account Settings**, and then **Service Tokens** to create a new [service account API token](/docs/dbt-cloud-apis/service-tokens). Save your token somewhere safe. -3. Assign permissions to service account tokens depending on the integration tool you choose. Refer to the [integration partner documentation](https://www.getdbt.com/product/semantic-layer-integrations) to determine the permission sets you need to assign. -4. Go to **Deploy** > **Environments**, and select your **Deployment** environment. -5. Click **Settings** on the top right side of the page. -6. Click **Edit** on the top right side of the page. -7. Select dbt version 1.2 or higher. -8. Toggle the Semantic Layer **On**. -9. Copy the full proxy server URL (like `https://eagle-hqya7.proxy.cloud.getdbt.com`) to connect to your [integrated partner tool](https://www.getdbt.com/product/semantic-layer-integrations). -10. Use the URL in the data source configuration of the integrated partner tool. -11. Use the data platform login credentials that make sense for how the data is consumed. - -:::info📌 - -It is _not_ recommended that you use your dbt Cloud credentials due to elevated permissions. Instead, you can use your specific integration tool permissions. - -::: - -12. Set up the [Discovery API](/docs/dbt-cloud-apis/discovery-api) (Team and Enterprise accounts only) in the integrated partner tool to import the metric definitions. 
The [integrated partner tool](https://www.getdbt.com/product/semantic-layer-integrations) will treat the dbt Server as another data source (like a data platform). This requires: - -- The account ID, environment ID, and job ID (which is visible in the job URL) -- An [API service token](/docs/dbt-cloud-apis/service-tokens) with job admin and metadata permissions -- Add the items above to the relevant fields in your integration tool - - -
## Related docs diff --git a/website/snippets/sl-prerequisites.md b/website/snippets/sl-prerequisites.md deleted file mode 100644 index 0c100c299b0..00000000000 --- a/website/snippets/sl-prerequisites.md +++ /dev/null @@ -1,38 +0,0 @@ - - -- Have a multi-tenant dbt Cloud instance, hosted in North America
-- Have both your production and development environments running dbt version 1.3 or higher
-- Use Snowflake data platform
-- Install the dbt metrics package version >=1.3.0, <1.4.0 in your dbt project
- * **Note** — After installing the dbt metrics package and updating the `packages.yml` file, make sure you run at least one model. -- Set up the Discovery API in the integrated tool to import metric definitions - * Developer accounts will be able to query the Proxy Server using SQL, but will not be able to browse pre-populated dbt metrics in external tools, which requires access to the Discovery API
-- Recommended - Review the dbt metrics page
- -
- - - -- Have a multi-tenant dbt Cloud instance, hosted in North America
-- Have both your production and development environments running dbt version 1.3 or higher
-- Use Snowflake data platform
-- Install the dbt metrics package version >=1.3.0, <1.4.0 in your dbt project
- * **Note** — After installing the dbt metrics package and updating the `packages.yml` file, make sure you run at least one model. -- Set up the Discovery API in the integrated tool to import metric definitions - * Developer accounts will be able to query the Proxy Server using SQL, but will not be able to browse pre-populated dbt metrics in external tools, which requires access to the Discovery API
-- Recommended - Review the dbt metrics page
- -
- - - -- Have a multi-tenant dbt Cloud instance, hosted in North America
-- Have both your production and development environments running dbt version 1.2
-- Use Snowflake data platform
-- Install the dbt metrics package version >=0.3.0, <0.4.0 in your dbt project
- * **Note** — After installing the dbt metrics package and updating the `packages.yml` file, make sure you run at least one model. -- Set up the Discovery API in the integrated tool to import metric definitions - * Developer accounts will be able to query the Proxy Server using SQL, but will not be able to browse pre-populated dbt metrics in external tools, which requires access to the Discovery API
-- Recommended - Review the dbt metrics page
- -
diff --git a/website/snippets/sl-public-preview-banner.md b/website/snippets/sl-public-preview-banner.md deleted file mode 100644 index e97527d356d..00000000000 --- a/website/snippets/sl-public-preview-banner.md +++ /dev/null @@ -1,7 +0,0 @@ -:::info 📌 - -The dbt Semantic Layer is currently available in Public Preview for multi-tenant dbt Cloud accounts hosted in North America. If you log in via https://cloud.getdbt.com/, you can access the Semantic Layer. If you log in with [another URL](/docs/cloud/about-cloud/regions-ip-addresses), the dbt Semantic Layer will be available in the future. - -For more info, review the [Prerequisites](/docs/use-dbt-semantic-layer/dbt-semantic-layer#prerequisites), [Public Preview](/docs/use-dbt-semantic-layer/quickstart-semantic-layer#public-preview), and [Product architecture](/docs/use-dbt-semantic-layer/dbt-semantic-layer#product-architecture) sections. - -::: diff --git a/website/snippets/sl-set-up-steps.md b/website/snippets/sl-set-up-steps.md deleted file mode 100644 index 295253fb994..00000000000 --- a/website/snippets/sl-set-up-steps.md +++ /dev/null @@ -1,30 +0,0 @@ - -Before you continue with the following steps, you **must** have a multi-tenant dbt Cloud account hosted in North America. - * Team and Enterprise accounts can set up the Semantic Layer and [Discovery API](/docs/dbt-cloud-apis/discovery-api) in the integrated partner tool to import metric definition. - * Developer accounts can query the Proxy Server using SQL but won't be able to browse dbt metrics in external tools, which requires access to the Discovery API. - -You can set up the dbt Semantic Layer in dbt Cloud at the environment level by following these steps: - -1. Login to your dbt Cloud account -2. Go to **Account Settings**, and then **Service Tokens** to create a new [service account API token](/docs/dbt-cloud-apis/service-tokens). Save your token somewhere safe. -3. Assign permissions to service account tokens depending on the integration tool you choose. You can review the [integration partner documentation](https://www.getdbt.com/product/semantic-layer-integrations) to determine the permission sets you need to assign. -4. Go to **Deploy** and then **Environment**, and select your **Deployment** environment. -5. Click on **Settings** on the top right side of the page. -6. Click **Edit** on the top right side of the page. -7. Select dbt version 1.2 or higher. -8. Toggle the Semantic Layer **On**. -9. Copy the full proxy server URL (like `https://eagle-hqya7.proxy.cloud.getdbt.com`) to connect to your [integrated partner tool](https://www.getdbt.com/product/semantic-layer-integrations). -10. Use the URL in the data source configuration of the integrated partner tool. -11. Use the data platform login credentials that make sense for how the data is consumed. - -:::info📌 - -Note - It is _not_ recommended that you use your dbt Cloud credentials due to elevated permissions. Instead, you can use your specific integration tool permissions. - -::: - -12. Set up the [Discovery API](/docs/dbt-cloud-apis/discovery-api) (Team and Enterprise accounts only) in the integrated partner tool to import the metric definitions. The [integrated partner tool](https://www.getdbt.com/product/semantic-layer-integrations) will treat the dbt Server as another data source (like a data platform). 
This requires: - -- The account ID, environment ID, and job ID (visible in the job URL) -- An [API service token](/docs/dbt-cloud-apis/service-tokens) with job admin and metadata permissions -- Add the items above to the relevant fields in your integration tool From 4113f882dd26f043edd1fd70ade408ac6fa3705c Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 12:59:55 -0500 Subject: [PATCH 08/22] remove link as redundant --- .../docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index 395ad258b62..11bd25a7020 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -8,7 +8,7 @@ date: 2023-12-15 dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes will be in effect on _December 15th, 2023_. -This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the product from the dbt Cloud user interface and [documentation site](https://docs.getdbt.com/). +This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the product from the dbt Cloud user interface and documentation site. ### Why this change? From 36a30c0ce3d0176bf2ff7633deeb7593e2810202 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 14:25:10 -0500 Subject: [PATCH 09/22] fix broken links --- .../docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md | 2 +- .../docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md | 2 +- .../docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md | 2 +- .../docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md | 2 +- website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md | 2 +- website/docs/guides/dbt-models-on-databricks.md | 2 +- website/docs/sql-reference/aggregate-functions/sql-count.md | 4 ++-- website/docs/terms/dry.md | 6 ++++-- 8 files changed, 12 insertions(+), 10 deletions(-) diff --git a/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md index a946bdf369b..240f0b86de3 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md +++ b/website/docs/docs/dbt-versions/core-upgrade/04-upgrading-to-v1.4.md @@ -48,7 +48,7 @@ For more detailed information and to ask any questions, please visit [dbt-core/d - [**Events and structured logging**](/reference/events-logging): dbt's event system got a makeover. Expect more consistency in the availability and structure of information, backed by type-safe event schemas. - [**Python support**](/faqs/Core/install-python-compatibility): Python 3.11 was released in October 2022. It is officially supported in dbt-core v1.4, although full support depends also on the adapter plugin for your data platform. According to the Python maintainers, "Python 3.11 is between 10-60% faster than Python 3.10." We encourage you to try [`dbt parse`](/reference/commands/parse) with dbt Core v1.4 + Python 3.11, and compare the timing with dbt Core v1.3 + Python 3.10. Let us know what you find! -- [**Metrics**](/docs/build/metrics): `time_grain` is optional, to provide better ergonomics around metrics that aren't time-bound. 
+- [**Metrics**](/docs/build/build-metrics-intro): `time_grain` is optional, to provide better ergonomics around metrics that aren't time-bound. - **dbt-Jinja context:** The [local_md5](/reference/dbt-jinja-functions/local_md5) context method will calculate an [MD5 hash](https://en.wikipedia.org/wiki/MD5) for use _within_ dbt. (Not to be confused with SQL md5!) - [**Exposures**](/docs/build/exposures) can now depend on `metrics`. - [**"Tarball" packages**](/docs/build/packages#internally-hosted-tarball-URL): Some organizations have security requirements to pull resources only from internal services. To address the need to install packages from hosted environments (such as Artifactory or cloud storage buckets), it's possible to specify any accessible URL where a compressed dbt package can be downloaded. diff --git a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md index d9d97f17dc5..5a381b16928 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md +++ b/website/docs/docs/dbt-versions/core-upgrade/05-upgrading-to-v1.3.md @@ -49,7 +49,7 @@ GitHub discussion with details: [dbt-labs/dbt-core#6011](https://github.com/dbt- ## New and changed documentation - **[Python models](/docs/build/python-models)** are natively supported in `dbt-core` for the first time, on data warehouses that support Python runtimes. -- Updates made to **[Metrics](/docs/build/metrics)** reflect their new syntax for definition, as well as additional properties that are now available. +- Updates made to **[Metrics](/docs/build/build-metrics-intro)** reflect their new syntax for definition, as well as additional properties that are now available. - Plus, a few related updates to **[exposure properties](/reference/exposure-properties)**: `config`, `label`, and `name` validation. - **[Custom `node_color`](/reference/resource-configs/docs.md)** in `dbt-docs`. For the first time, you can control the colors displayed in dbt's DAG. Want bronze, silver, and gold layers? It's at your fingertips. diff --git a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md index 72a3e0c82ad..cd75e7f411b 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md +++ b/website/docs/docs/dbt-versions/core-upgrade/06-upgrading-to-v1.2.md @@ -34,7 +34,7 @@ See GitHub discussion [dbt-labs/dbt-core#5468](https://github.com/dbt-labs/dbt-c ## New and changed functionality - **[Grants](/reference/resource-configs/grants)** are natively supported in `dbt-core` for the first time. That support extends to all standard materializations, and the most popular adapters. If you already use hooks to apply simple grants, we encourage you to use built-in `grants` to configure your models, seeds, and snapshots instead. This will enable you to [DRY](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself) up your duplicated or boilerplate code. -- **[Metrics](/docs/build/metrics)** now support an `expression` type (metrics-on-metrics), as well as a `metric()` function to use when referencing metrics from within models, macros, or `expression`-type metrics. 
For more information on how to use expression metrics, check out the [**`dbt_metrics` package**](https://github.com/dbt-labs/dbt_metrics) +- **[Metrics](/docs/build/build-metrics-intro)** now support an `expression` type (metrics-on-metrics), as well as a `metric()` function to use when referencing metrics from within models, macros, or `expression`-type metrics. For more information on how to use expression metrics, check out the [**`dbt_metrics` package**](https://github.com/dbt-labs/dbt_metrics) - **[dbt-Jinja functions](/reference/dbt-jinja-functions)** now include the [`itertools` Python module](/reference/dbt-jinja-functions/modules#itertools), as well as the [set](/reference/dbt-jinja-functions/set) and [zip](/reference/dbt-jinja-functions/zip) functions. - **[Node selection](/reference/node-selection/syntax)** includes a [file selection method](/reference/node-selection/methods#the-file-method) (`-s model.sql`), and [yaml selector](/reference/node-selection/yaml-selectors) inheritance. - **[Global configs](/reference/global-configs/about-global-configs)** now include CLI flag and environment variable settings for [`target-path`](/reference/project-configs/target-path) and [`log-path`](/reference/project-configs/log-path), which can be used to override the values set in `dbt_project.yml` diff --git a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md index 6e437638ef6..99de91c6565 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md +++ b/website/docs/docs/dbt-versions/core-upgrade/08-upgrading-to-v1.0.md @@ -70,7 +70,7 @@ Several under-the-hood changes from past minor versions, tagged with deprecation ## New features and changed documentation -- Add [metrics](/docs/build/metrics), a new node type +- Add [metrics](/docs/build/build-metrics-intro), a new node type - [Generic tests](/best-practices/writing-custom-generic-tests) can be defined in `tests/generic` (new), in addition to `macros/` (as before) - [Parsing](/reference/parsing): partial parsing and static parsing have been turned on by default. - [Global configs](/reference/global-configs/about-global-configs) have been standardized. Related updates to [global CLI flags](/reference/global-cli-flags) and [`profiles.yml`](/docs/core/connect-data-platform/profiles.yml). diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 19fcc4b9eda..2ca81a0daf2 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -92,7 +92,7 @@ import SlFaqs from '/snippets/_sl-faqs.md'; ## Next steps -- [Set up dbt Semantic Layer](docs/use-dbt-semantic-layer/setup-dbt-sl) +- [Set up dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-dbt-sl) - [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) - Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) - [Billing](/docs/cloud/billing) diff --git a/website/docs/guides/dbt-models-on-databricks.md b/website/docs/guides/dbt-models-on-databricks.md index 489a3c28467..907b677cff9 100644 --- a/website/docs/guides/dbt-models-on-databricks.md +++ b/website/docs/guides/dbt-models-on-databricks.md @@ -104,7 +104,7 @@ When you delete a record from a Delta table, it is a soft delete. 
What this mean Now onto the most final layer — the gold marts that business stakeholders typically interact with from their preferred BI tool. The considerations here will be fairly similar to the silver layer except that these marts are more likely to handling aggregations. Further, you will likely want to be even more intentional about Z-Ordering these tables as SLAs tend to be lower with these direct stakeholder facing tables. -In addition, these tables are well suited for defining [dbt metrics](/docs/build/metrics) on to ensure simplicity and consistency across your key business KPIs! Using the [dbt_metrics package](https://hub.getdbt.com/dbt-labs/metrics/latest/), you can query the metrics inside of your own dbt project even. With the upcoming Semantic Layer Integration, you can also then query the metrics in any of the partner integrated tools. +In addition, these tables are well suited for defining [metrics](/docs/build/build-metrics-intro) on to ensure simplicity and consistency across your key business KPIs! Using the [MetricFlow](https://github.com/dbt-labs/metricflow), you can query the metrics inside of your own dbt project even. With the upcoming Semantic Layer Integration, you can also then query the metrics in any of the partner integrated tools. ### Filter rows in target and/or source diff --git a/website/docs/sql-reference/aggregate-functions/sql-count.md b/website/docs/sql-reference/aggregate-functions/sql-count.md index 42ece4b124f..d65c670df90 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-count.md +++ b/website/docs/sql-reference/aggregate-functions/sql-count.md @@ -60,6 +60,6 @@ Some data warehouses, such as Snowflake and Google BigQuery, additionally suppor We most commonly see queries using COUNT to: - Perform initial data exploration on a dataset to understand dataset volume, primary key uniqueness, distribution of column values, and more. - Calculate the counts of key business metrics (daily orders, customers created, etc.) in your data models or BI tool. -- Define [dbt metrics](/docs/build/metrics) to aggregate key metrics. +- Define [metrics](/docs/build/build-metrics-intro) to aggregate key metrics. -This isn’t an extensive list of where your team may be using COUNT throughout your development work, dbt models, and BI tool logic, but it contains some common scenarios analytics engineers face day-to-day. \ No newline at end of file +This isn’t an extensive list of where your team may be using COUNT throughout your development work, dbt models, and BI tool logic, but it contains some common scenarios analytics engineers face day-to-day. diff --git a/website/docs/terms/dry.md b/website/docs/terms/dry.md index b1649278cd2..ec1c9229567 100644 --- a/website/docs/terms/dry.md +++ b/website/docs/terms/dry.md @@ -42,8 +42,10 @@ Most teams have essential business logic that defines the successes and failures By writing DRY definitions for key business logic and metrics that are referenced throughout a dbt project and/or BI (business intelligence) tool, data teams can create those single, unambiguous, and authoritative representations for their essential transformations. Gone are the days of 15 different definitions and values for churn, and in are the days of standardization and DRYness. -:::note Experimental dbt Metrics! -dbt v1.0 currently supports the use of experimental metrics, time series aggregations over a table that support zero or one dimensions. 
Using [dbt Metrics](/docs/build/metrics), data teams can define metric calculations, ownerships, and definitions in a YAML file that lives within their dbt project. dbt Metrics are in their experimental stage; if you’re interesting in learning more about dbt Metrics, please make sure to join the #dbt-metrics-and-server channel in the [dbt Community Slack](https://www.getdbt.com/community/join-the-community/). +:::important dbt Semantic Layer, powered by MetricFlow + +The [dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), powered by [MetricFlow](/docs/build/about-metricflow), simplifies the process of defining and using critical business metrics, like revenue in the modeling layer (your dbt project). By centralizing metric definitions, data teams can ensure consistent self-service access to these metrics in downstream data tools and applications. The dbt Semantic Layer eliminates duplicate coding by allowing data teams to define metrics on top of existing models and automatically handles data joins. + ::: ## Tools to help you write DRY code From c4cdcda7ea7bb5bdbb38e9e5fd1f529445a954cb Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 14:58:45 -0500 Subject: [PATCH 10/22] fix broken links --- website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md | 2 +- website/docs/sql-reference/aggregate-functions/sql-sum.md | 4 ++-- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 2ca81a0daf2..9625621562e 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -92,7 +92,7 @@ import SlFaqs from '/snippets/_sl-faqs.md'; ## Next steps -- [Set up dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-dbt-sl) +- [Set up dbt Semantic Layer](/docs/use-dbt-semantic-layer/setup-sl) - [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) - Demo on [how to define and query metrics with MetricFlow](https://www.loom.com/share/60a76f6034b0441788d73638808e92ac?sid=861a94ac-25eb-4fd8-a310-58e159950f5a) - [Billing](/docs/cloud/billing) diff --git a/website/docs/sql-reference/aggregate-functions/sql-sum.md b/website/docs/sql-reference/aggregate-functions/sql-sum.md index cb9235798d2..d6ca00c2daa 100644 --- a/website/docs/sql-reference/aggregate-functions/sql-sum.md +++ b/website/docs/sql-reference/aggregate-functions/sql-sum.md @@ -57,8 +57,8 @@ All modern data warehouses support the ability to use the SUM function (and foll We most commonly see queries using SUM to: - Calculate the cumulative sum of a metric across a customer/user id using a CASE WHEN statement (ex. `sum(case when order_array is not null then 1 else 0 end) as count_orders`) -- Create [dbt metrics](/docs/build/metrics) for key business values, such as LTV +- Create [dbt metrics](/docs/build/build-metrics-intro) for key business values, such as LTV - Calculate the total of a field across a dimension (ex. total session time, total time spent per ticket) that you typically use in `fct_` or `dim_` models - Summing clicks, spend, impressions, and other key ad reporting metrics in tables from ad platforms -This isn’t an extensive list of where your team may be using SUM throughout your development work, dbt models, and BI tool logic, but it contains some common scenarios analytics engineers face day-to-day. 
\ No newline at end of file +This isn’t an extensive list of where your team may be using SUM throughout your development work, dbt models, and BI tool logic, but it contains some common scenarios analytics engineers face day-to-day. From f0235af5df5e5c6d7a8c3b67dedcd8e04d398a22 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 15:29:13 -0500 Subject: [PATCH 11/22] remove version blocks --- website/snippets/_v2-sl-prerequisites.md | 30 ------------------------ 1 file changed, 30 deletions(-) diff --git a/website/snippets/_v2-sl-prerequisites.md b/website/snippets/_v2-sl-prerequisites.md index eb8b5fc27e4..99d8a945db6 100644 --- a/website/snippets/_v2-sl-prerequisites.md +++ b/website/snippets/_v2-sl-prerequisites.md @@ -1,6 +1,3 @@ - - - - Have a dbt Cloud Team or Enterprise account. Suitable for both Multi-tenant and Single-tenant deployment. - Note: Single-tenant accounts should contact their account representative for necessary setup and enablement. - Have both your production and development environments running [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-core-in-cloud). @@ -11,30 +8,3 @@ - dbt Core or Developer accounts can define metrics but won't be able to dynamically query them.
- Understand the key concepts of [MetricFlow](/docs/build/about-metricflow), which powers the latest dbt Semantic Layer.
- Note that SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections, [PrivateLink](/docs/cloud/secure/about-privatelink), and [Single sign-on (SSO)](/docs/cloud/manage-access/sso-overview) aren't supported in the dbt Semantic Layer yet.
-
- - - - -- Have a multi-tenant dbt Cloud instance, hosted in North America
-- Have both your production and development environments running dbt version 1.3 or higher
-- Use Snowflake data platform
-- Install the dbt metrics package version >=1.3.0, <1.4.0 in your dbt project
- * **Note** — After installing the dbt metrics package and updating the `packages.yml` file, make sure you run at least one model. -- Set up the Discovery API in the integrated tool to import metric definitions - * Developer accounts will be able to query the Proxy Server using SQL, but will not be able to browse pre-populated dbt metrics in external tools, which requires access to the Discovery API
- -
- - - -- Have a multi-tenant dbt Cloud instance, hosted in North America
-- Have both your production and development environments running dbt version 1.2
-- Use Snowflake data platform
-- Install the dbt metrics package version >=0.3.0, <0.4.0 in your dbt project
- * **Note** — After installing the dbt metrics package and updating the `packages.yml` file, make sure you run at least one model. -- Set up the Discovery API in the integrated tool to import metric definitions - * Developer accounts will be able to query the Proxy Server using SQL, but will not be able to browse pre-populated dbt metrics in external tools, which requires access to the Discovery API
- -
From 0408a62be7f2d9e9ef21948b9fb3def422600430 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Mon, 11 Dec 2023 15:56:28 -0500 Subject: [PATCH 12/22] tweak --- website/snippets/_new-sl-setup.md | 2 +- website/snippets/_sl-test-and-query-metrics.md | 1 - 2 files changed, 1 insertion(+), 2 deletions(-) diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md index 18e75c3278d..c69cb23f425 100644 --- a/website/snippets/_new-sl-setup.md +++ b/website/snippets/_new-sl-setup.md @@ -8,7 +8,7 @@ You can set up the dbt Semantic Layer in dbt Cloud at the environment and projec - You must have a successful run in your new environment. :::tip -If you're using the legacy Semantic Layer, dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or newer to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details. +If you've configured the legacy Semantic Layer, dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details. ::: 1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher. diff --git a/website/snippets/_sl-test-and-query-metrics.md b/website/snippets/_sl-test-and-query-metrics.md index 2e9490f089d..b0db4bb520d 100644 --- a/website/snippets/_sl-test-and-query-metrics.md +++ b/website/snippets/_sl-test-and-query-metrics.md @@ -65,4 +65,3 @@ To streamline your metric querying process, you can connect to the [dbt Semantic - From eda9ad9438f6776358661dc41b0c80b3e84fd65f Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Tue, 12 Dec 2023 07:41:51 -0500 Subject: [PATCH 13/22] add notice --- website/docs/docs/dbt-cloud-apis/sl-api-overview.md | 8 +++----- website/docs/docs/dbt-cloud-apis/sl-graphql.md | 6 +++--- website/docs/docs/dbt-cloud-apis/sl-jdbc.md | 6 +++--- website/docs/docs/dbt-cloud-apis/sl-manifest.md | 7 +++---- .../docs/use-dbt-semantic-layer/avail-sl-integrations.md | 8 ++++++++ website/docs/docs/use-dbt-semantic-layer/dbt-sl.md | 7 +++++++ website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md | 7 +++++++ website/docs/docs/use-dbt-semantic-layer/setup-sl.md | 8 ++++++++ .../docs/docs/use-dbt-semantic-layer/sl-architecture.md | 8 ++++++++ website/snippets/_sl-deprecation-notice.md | 6 ++---- 10 files changed, 52 insertions(+), 19 deletions(-) diff --git a/website/docs/docs/dbt-cloud-apis/sl-api-overview.md b/website/docs/docs/dbt-cloud-apis/sl-api-overview.md index 3ddbf76d152..6644d3e4b8b 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-api-overview.md +++ b/website/docs/docs/dbt-cloud-apis/sl-api-overview.md @@ -9,10 +9,10 @@ pagination_next: "docs/dbt-cloud-apis/sl-jdbc" -import LegacyInfo from '/snippets/_legacy-sl-callout.md'; - - +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + The rapid growth of different tools in the modern data stack has helped data professionals address the diverse needs of different teams. The downside of this growth is the fragmentation of business logic across teams, tools, and workloads. 
@@ -57,5 +57,3 @@ plan="dbt Cloud Team or Enterprise" icon="dbt-bit"/> - - diff --git a/website/docs/docs/dbt-cloud-apis/sl-graphql.md b/website/docs/docs/dbt-cloud-apis/sl-graphql.md index b7d13d0d453..3555b211f4f 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-graphql.md +++ b/website/docs/docs/dbt-cloud-apis/sl-graphql.md @@ -7,10 +7,10 @@ tags: [Semantic Layer, APIs] -import LegacyInfo from '/snippets/_legacy-sl-callout.md'; - - +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + diff --git a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md index aba309566f8..45b012c67c6 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-jdbc.md +++ b/website/docs/docs/dbt-cloud-apis/sl-jdbc.md @@ -7,10 +7,10 @@ tags: [Semantic Layer, API] -import LegacyInfo from '/snippets/_legacy-sl-callout.md'; - - +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + The dbt Semantic Layer Java Database Connectivity (JDBC) API enables users to query metrics and dimensions using the JDBC protocol, while also providing standard metadata functionality. diff --git a/website/docs/docs/dbt-cloud-apis/sl-manifest.md b/website/docs/docs/dbt-cloud-apis/sl-manifest.md index 6ecac495869..eefa0bfc15e 100644 --- a/website/docs/docs/dbt-cloud-apis/sl-manifest.md +++ b/website/docs/docs/dbt-cloud-apis/sl-manifest.md @@ -9,10 +9,10 @@ pagination_next: null -import LegacyInfo from '/snippets/_legacy-sl-callout.md'; - - +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + dbt creates an [artifact](/reference/artifacts/dbt-artifacts) file called the _Semantic Manifest_ (`semantic_manifest.json`), which MetricFlow requires to build and run metric queries properly for the dbt Semantic Layer. This artifact contains comprehensive information about your dbt Semantic Layer. It is an internal file that acts as the integration point with MetricFlow. @@ -97,4 +97,3 @@ Top-level keys for the semantic manifest are: - [dbt Semantic Layer API](/docs/dbt-cloud-apis/sl-api-overview) - [About dbt artifacts](/reference/artifacts/dbt-artifacts) - diff --git a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md index 045838602a9..eb05fc75649 100644 --- a/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md +++ b/website/docs/docs/use-dbt-semantic-layer/avail-sl-integrations.md @@ -9,6 +9,14 @@ meta: api_name: dbt Semantic Layer APIs --- + + +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + + + + There are a number of data applications that seamlessly integrate with the dbt Semantic Layer, powered by MetricFlow, from business intelligence tools to notebooks, spreadsheets, data catalogs, and more. These integrations allow you to query and unlock valuable insights from your data ecosystem. Use the [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview) to simplify metric queries, optimize your development workflow, and reduce coding. This approach also ensures data governance and consistency for data consumers. 
diff --git a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md index cde9ae4afbb..ccbef5a6639 100644 --- a/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/dbt-sl.md @@ -9,6 +9,13 @@ pagination_next: "docs/use-dbt-semantic-layer/quickstart-sl" pagination_prev: null --- + + +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + + + The dbt Semantic Layer, powered by [MetricFlow](/docs/build/about-metricflow), simplifies the process of defining and using critical business metrics, like `revenue` in the modeling layer (your dbt project). By centralizing metric definitions, data teams can ensure consistent self-service access to these metrics in downstream data tools and applications. The dbt Semantic Layer eliminates duplicate coding by allowing data teams to define metrics on top of existing models and automatically handles data joins. diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 9625621562e..3d8cdb5a56b 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -8,6 +8,13 @@ meta: api_name: dbt Semantic Layer APIs --- + + +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + + + import CreateModel from '/snippets/_sl-create-semanticmodel.md'; import DefineMetrics from '/snippets/_sl-define-metrics.md'; diff --git a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md index 0944e5ce1e1..1016de1830a 100644 --- a/website/docs/docs/use-dbt-semantic-layer/setup-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/setup-sl.md @@ -6,6 +6,14 @@ sidebar_label: "Set up your Semantic Layer" tags: [Semantic Layer] --- + + +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + + + + With the dbt Semantic Layer, you can centrally define business metrics, reduce code duplication and inconsistency, create self-service in downstream tools, and more. Configure the dbt Semantic Layer in dbt Cloud to connect with your integrated partner tool. ## Prerequisites diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md index f1fd13944e9..459fcfc487f 100644 --- a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md +++ b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md @@ -7,6 +7,14 @@ tags: [Semantic Layer] pagination_next: null --- + + +import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; + + + + + The dbt Semantic Layer allows you to define metrics and use various interfaces to query them. The Semantic Layer does the heavy lifting to find where the queried data exists in your data platform and generates the SQL to make the request (including performing joins). diff --git a/website/snippets/_sl-deprecation-notice.md b/website/snippets/_sl-deprecation-notice.md index 19bf19c2d90..d19dd78ca15 100644 --- a/website/snippets/_sl-deprecation-notice.md +++ b/website/snippets/_sl-deprecation-notice.md @@ -1,7 +1,5 @@ :::info Deprecation of dbt Metrics and the legacy dbt Semantic Layer -For users of the dbt Semantic Layer on version 1.5 or lower — Support for dbt Metrics and the legacy dbt Semantic Layer ends on December 15th, 2023. 
To access the latest features, migrate to the updated version using the [dbt Semantic Layer migration guide](/guides/sl-migration). - - -After December 15th, dbt Labs will no longer support these deprecated features, they will be removed from the dbt Cloud user interface, and their documentation removed from the docs site. +dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes went into effect on December 15th, 2023. +To switch over and access [MetricFlow](/docs/build/build-metrics-intro) or the re-released dbt Semantic Layer, use the [dbt Semantic Layer migration guide](/guides/sl-migration) and [upgrade your version](/docs/dbt-versions/upgrade-core-in-cloud) in dbt Cloud. ::: From a7612764497537d23987f148a4d7e52b2cbfff7b Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Tue, 12 Dec 2023 07:44:25 -0500 Subject: [PATCH 14/22] update notice --- .../docs/use-dbt-semantic-layer/quickstart-sl.md | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 3d8cdb5a56b..665260ed9f4 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -8,6 +8,13 @@ meta: api_name: dbt Semantic Layer APIs --- +import CreateModel from '/snippets/_sl-create-semanticmodel.md'; +import DefineMetrics from '/snippets/_sl-define-metrics.md'; +import ConfigMetric from '/snippets/_sl-configure-metricflow.md'; +import TestQuery from '/snippets/_sl-test-and-query-metrics.md'; +import ConnectQueryAPI from '/snippets/_sl-connect-and-query-api.md'; +import RunProdJob from '/snippets/_sl-run-prod-job.md'; + import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; @@ -16,13 +23,6 @@ import DeprecationNotice from '/snippets/_sl-deprecation-notice.md'; -import CreateModel from '/snippets/_sl-create-semanticmodel.md'; -import DefineMetrics from '/snippets/_sl-define-metrics.md'; -import ConfigMetric from '/snippets/_sl-configure-metricflow.md'; -import TestQuery from '/snippets/_sl-test-and-query-metrics.md'; -import ConnectQueryAPI from '/snippets/_sl-connect-and-query-api.md'; -import RunProdJob from '/snippets/_sl-run-prod-job.md'; - The dbt Semantic Layer, powered by [MetricFlow](/docs/build/about-metricflow), simplifies defining and using critical business metrics. It centralizes metric definitions, eliminates duplicate coding, and ensures consistent self-service access to metrics in downstream tools. From 7166bb7314a80c18a584965e5d67ced6f9444dae Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Tue, 12 Dec 2023 13:03:00 -0500 Subject: [PATCH 15/22] tweak --- .../release-notes/74-Dec-2023/legacy-sl.md | 31 +++++++++++++------ 1 file changed, 21 insertions(+), 10 deletions(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index 11bd25a7020..c1949f8dfdc 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -8,23 +8,34 @@ date: 2023-12-15 dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes will be in effect on _December 15th, 2023_. -This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. 
We will also remove the product from the dbt Cloud user interface and documentation site. +This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the feature from the dbt Cloud user interface and documentation site. ### Why this change? -We understand that changes of this nature can be disruptive and we believe in the potential of the new direction. The [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), powered by MetricFlow, is a significantly improved foundation that enables more flexible query generation, faster performance, and a more dynamic consumption experience. It’s a step towards a brighter future for dbt and its community. +The [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), powered by MetricFlow, offers enhanced flexibility, performance, and user experience, marking a significant advancement for the dbt community. -### Key changes +### Key changes and impact -- Legacy Semantic Layer and dbt Metrics will officially be deprecated on December 15th, 2023. -- MetricFlow has replaced dbt Metrics for the construction of semantic logic. The `dbt_metrics` package will no longer be supported. -- Exports replaces Materializations. Exports will be available in December or January in dbt Cloud and replaces the previous materializations functionality. +- **Deprecation date** — The legacy Semantic Layer and dbt Metrics will be officially deprecated on December 15th, 2023. +- **Replacement** — [MetricFlow](/docs/build/build-metrics-intro) replaces dbt Metrics for defining semantic logic. The `dbt_metrics` package will no longer be supported post-deprecation. +- **New feature** — Exports replaces the materializations functionality and will be available in dbt Cloud in December or January. -### Actions -- If you're using the legacy dbt Semantic Layer or dbt Metrics, use the [dbt Semantic Layer migration guide](/guides/sl-migration?step=1) to migrate over to the re-released dbt Semantic Layer. -- Engage and share feedback with the dbt Labs team and dbt Community slack using channels like [#dbt-cloud-semantic-layer](https://getdbt.slack.com/archives/C046L0VTVR6) and [#dbt-metricflow](https://getdbt.slack.com/archives/C02CCBBBR1D). Or reach out to your dbt Cloud account representative. -- Refer to some additional info and resources to help you upgrade your dbt version: +### Breaking changes and recommendations + +- For users on dbt version 1.6 and lower with dbt Metrics and Snowflake proxy: + - **Impact**: Post-deprecation, queries using the proxy _will not_ run. + - **Action required:** _Immediate_ migration is necessary. Refer to the [dbt Semantic Layer migration guide](/guides/sl-migration?step=1) + +- For users on dbt version 1.6 and lower using dbt Metrics without Snowflake proxy: + - **Impact**: No immediate disruption, but the package will not receive updates or support after deprecation + - **Recommendation**: Plan migration to the new Semantic Layer for compatibility with dbt version 1.6 and higher. + +### Engage and support + +- Feedback and community support — Engage and share feedback with the dbt Labs team and dbt Community slack using channels like [#dbt-cloud-semantic-layer](https://getdbt.slack.com/archives/C046L0VTVR6) and [#dbt-metricflow](https://getdbt.slack.com/archives/C02CCBBBR1D). Or reach out to your dbt Cloud account representative. 
+ +- Resources for upgrading —Refer to some additional info and resources to help you upgrade your dbt version: - [Upgrade version in dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) - [Version migration guides](/docs/dbt-versions/core-upgrade) From c84a378eba2a16cb041c0443c7c8e44fcdea6a40 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Tue, 12 Dec 2023 13:03:46 -0500 Subject: [PATCH 16/22] change --- .../docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index c1949f8dfdc..4c8d4278ef5 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -29,7 +29,7 @@ The [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), power - For users on dbt version 1.6 and lower using dbt Metrics without Snowflake proxy: - **Impact**: No immediate disruption, but the package will not receive updates or support after deprecation - - **Recommendation**: Plan migration to the new Semantic Layer for compatibility with dbt version 1.6 and higher. + - **Recommendation**: Plan migration to the re-released Semantic Layer for compatibility with dbt version 1.6 and higher. ### Engage and support From a0b2d9d4262c2e300c6c384048be75b922bb6ea7 Mon Sep 17 00:00:00 2001 From: mirnawong1 Date: Tue, 12 Dec 2023 13:09:02 -0500 Subject: [PATCH 17/22] fi bullets --- .../docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index 4c8d4278ef5..e940583c602 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -34,8 +34,6 @@ The [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), power ### Engage and support - Feedback and community support — Engage and share feedback with the dbt Labs team and dbt Community slack using channels like [#dbt-cloud-semantic-layer](https://getdbt.slack.com/archives/C046L0VTVR6) and [#dbt-metricflow](https://getdbt.slack.com/archives/C02CCBBBR1D). Or reach out to your dbt Cloud account representative. 
- - Resources for upgrading —Refer to some additional info and resources to help you upgrade your dbt version: - - [Upgrade version in dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) - [Version migration guides](/docs/dbt-versions/core-upgrade) From a4e83e41016851832a7107bfcd246724eabb877a Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 13 Dec 2023 07:17:50 -0500 Subject: [PATCH 18/22] Update website/snippets/_sl-deprecation-notice.md Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> --- website/snippets/_sl-deprecation-notice.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/snippets/_sl-deprecation-notice.md b/website/snippets/_sl-deprecation-notice.md index d19dd78ca15..610b1574b7d 100644 --- a/website/snippets/_sl-deprecation-notice.md +++ b/website/snippets/_sl-deprecation-notice.md @@ -1,5 +1,5 @@ :::info Deprecation of dbt Metrics and the legacy dbt Semantic Layer dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes went into effect on December 15th, 2023. -To switch over and access [MetricFlow](/docs/build/build-metrics-intro) or the re-released dbt Semantic Layer, use the [dbt Semantic Layer migration guide](/guides/sl-migration) and [upgrade your version](/docs/dbt-versions/upgrade-core-in-cloud) in dbt Cloud. +To migrate and access [MetricFlow](/docs/build/build-metrics-intro) or the re-released dbt Semantic Layer, use the [dbt Semantic Layer migration guide](/guides/sl-migration) and [upgrade your version](/docs/dbt-versions/upgrade-core-in-cloud) in dbt Cloud. ::: From 735edf3eada82d5628b523ff265f7cead70155fc Mon Sep 17 00:00:00 2001 From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com> Date: Wed, 13 Dec 2023 07:18:14 -0500 Subject: [PATCH 19/22] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com> --- .../docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md index e940583c602..efb13132506 100644 --- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md +++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md @@ -8,7 +8,7 @@ date: 2023-12-15 dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes will be in effect on _December 15th, 2023_. -This deprecation means dbt Metrics and the legacy Semantic Layer will no longer be supported. We will also remove the feature from the dbt Cloud user interface and documentation site. +This deprecation means dbt Metrics and the legacy Semantic Layer are no longer supported. We also removed the feature from the dbt Cloud user interface and documentation site. ### Why this change? 
From 5a2b28ecac45fb4e347ea31bbf2aae39684e5bae Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Wed, 13 Dec 2023 07:19:24 -0500
Subject: [PATCH 20/22] Update website/snippets/_new-sl-setup.md

Co-authored-by: Matt Shaver <60105315+matthewshaver@users.noreply.github.com>
---
 website/snippets/_new-sl-setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md
index c69cb23f425..a02481db33d 100644
--- a/website/snippets/_new-sl-setup.md
+++ b/website/snippets/_new-sl-setup.md
@@ -8,7 +8,7 @@ You can set up the dbt Semantic Layer in dbt Cloud at the environment and projec
 - You must have a successful run in your new environment.

 :::tip
-If you've configured the legacy Semantic Layer, dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details.
+If you've configured the legacy Semantic Layer, it has been deprecated, and dbt Labs strongly recommends that you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt version 1.6 or higher to use the latest dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for details.
 :::

 1. In dbt Cloud, create a new [deployment environment](/docs/deploy/deploy-environments#create-a-deployment-environment) or use an existing environment on dbt 1.6 or higher.

From 1d66dfd2e538d6c84e3f1c1d076cf717b0145fbe Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Wed, 13 Dec 2023 07:32:16 -0500
Subject: [PATCH 21/22] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md

---
 .../docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md
index efb13132506..bc761d35e82 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md
@@ -6,7 +6,7 @@ sidebar_position: 09
 date: 2023-12-15
 ---

-dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. These changes will be in effect on _December 15th, 2023_.
+dbt Labs has deprecated dbt Metrics and the legacy dbt Semantic Layer, both supported on dbt version 1.5 or lower. This change starts on December 15th, 2023.

 This deprecation means dbt Metrics and the legacy Semantic Layer are no longer supported. We also removed the feature from the dbt Cloud user interface and documentation site.
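For readers following the `_new-sl-setup.md` tip in PATCH 20 above, here is a minimal sketch of what a metric can look like once migrated to MetricFlow on dbt v1.6 or higher. This is illustration only and is not part of the changeset: it assumes the dbt v1.6 semantic model and metric spec, and the file path, model, and names (`dim_customers`, `new_customers`, and so on) are hypothetical placeholders.

```yaml
# models/marts/customers.yml (hypothetical path and names, for illustration only)
semantic_models:
  - name: customers
    model: ref('dim_customers')        # hypothetical upstream model
    defaults:
      agg_time_dimension: signup_date
    entities:
      - name: customer
        type: primary
        expr: customer_id
    dimensions:
      - name: signup_date
        type: time
        type_params:
          time_granularity: day
      - name: plan
        type: categorical
    measures:
      - name: customer_count
        agg: count_distinct
        expr: customer_id

metrics:
  - name: new_customers                # hypothetical metric name
    label: New Customers
    type: simple
    type_params:
      measure: customer_count
```

Treat this as a sketch only; the [migration guide](/guides/sl-migration) is the authoritative mapping from the old `metrics:` properties to MetricFlow.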
From 808336651e2b733db9ece174a06f7a277ee84a7c Mon Sep 17 00:00:00 2001
From: mirnawong1 <89008547+mirnawong1@users.noreply.github.com>
Date: Wed, 13 Dec 2023 11:28:33 -0500
Subject: [PATCH 22/22] Update website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md

---
 .../docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md
index bc761d35e82..df4616f4d43 100644
--- a/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md
+++ b/website/docs/docs/dbt-versions/release-notes/74-Dec-2023/legacy-sl.md
@@ -18,7 +18,7 @@ The [re-released dbt Semantic Layer](/docs/use-dbt-semantic-layer/dbt-sl), power

 - **Deprecation date** — The legacy Semantic Layer and dbt Metrics will be officially deprecated on December 15th, 2023.
 - **Replacement** — [MetricFlow](/docs/build/build-metrics-intro) replaces dbt Metrics for defining semantic logic. The `dbt_metrics` package will no longer be supported post-deprecation.
-- **New feature** — Exports replaces the materializations functionality and will be available in dbt Cloud in December or January.
+- **New feature** — Exports replaces the materializing data with `metrics.calculate` functionality and will be available in dbt Cloud in December or January.

 ### Breaking changes and recommendations
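For context on the `metrics.calculate` wording in PATCH 22 above: the deprecated `dbt_metrics` package let projects materialize metric data by selecting from the `metrics.calculate` macro inside a model. A rough sketch of that legacy, now-deprecated pattern is below; the model file and metric name are hypothetical and are shown only to anchor what Exports replaces.

```sql
-- models/new_customers_weekly.sql (hypothetical model, for illustration only)
-- Legacy dbt_metrics pattern on dbt v1.5 or lower, deprecated along with the package.
select *
from {{ metrics.calculate(
    metric('new_customers'),          -- hypothetical metric defined under the old metrics: spec
    grain='week',                     -- time grain to aggregate to
    dimensions=['plan', 'country']    -- optional dimensions to group by
) }}
```

With the re-released Semantic Layer, the equivalent workflow is to define the metric with MetricFlow and use Exports, or a connected integration, to write query results back to the warehouse instead of materializing them through a package macro.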