Releases: dbt-labs/metricflow

v0.130.0

19 Oct 20:43
9d9d10c

Highlights

Introducing query support for dbt metrics!

With this release you can now use MetricFlow to run queries against your dbt metrics config! If you want to use the MetricFlow toolchain to query your dbt metrics, a simple configuration change is all it takes: reinstall MetricFlow with the appropriate dbt package (see below for supported installations) and make sure the following is in your .metricflow/config.yaml:

model_path: /path/to/dbt/project/root
dbt_repo: true

From there you can use all of MetricFlow's tools to query that model!
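As a hedged sketch of the workflow above (the Snowflake extra, the metric name `revenue`, and the dimension name `customer__country` are placeholders, not from this release note):

```shell
# Reinstall MetricFlow with the dbt extra for your warehouse (Snowflake here).
pip install "metricflow[dbt-snowflake]"

# With model_path and dbt_repo set in .metricflow/config.yaml,
# the usual CLI tools now run against the dbt metrics config:
mf list-metrics
mf query --metrics revenue --dimensions customer__country
```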

The supported installations can be added as follows:

  1. BigQuery: pip install metricflow[dbt-bigquery]
  2. Postgres: pip install metricflow[dbt-postgres]
  3. Redshift: pip install metricflow[dbt-redshift]
  4. Snowflake: pip install metricflow[dbt-snowflake]

Packaging changes

  • Use of the new dbt integration requires installing extended package dependencies, which are not pulled in by default.
  • Developers encountering module-not-found errors with dbt models will need to run the expanded installation via poetry install -E dbt-<data_warehouse>, where <data_warehouse> is one of the supported extras noted above.

Updated

  • Internal refactor to use the more appropriate MetricReferences as lookup keys in place of MetricSpec classes. (@WilliamDee)

v0.120.0

13 Oct 23:54

Highlights

We now support Databricks! If you use Databricks for your metrics data warehousing needs, give it a go! You'll need your server hostname, HTTP path, and access token (see the Databricks documentation for where to find these), supplied via the keys dwh_host, dwh_http_path, and dwh_access_token in your data warehouse configuration.
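A hedged sketch of the corresponding .metricflow/config.yaml entries (all values below are placeholders, and the dwh_dialect key is an assumption based on MetricFlow's standard warehouse configuration, not stated in this release note):

```yaml
dwh_dialect: databricks
dwh_host: my-workspace.cloud.databricks.com
dwh_http_path: /sql/1.0/endpoints/1234567890abcdef
dwh_access_token: dapiXXXXXXXXXXXX
```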

We have also fixed some bugs and improved support for semi-additive measures, and added the ability to use measure-specific constraints inside of more complex metric definitions!

Breaking Changes

  • A minor API change to the SqlClient protocol could break existing API users who write custom SqlClient classes. MyPy should detect this issue. See the relevant PR for details.

Added

  • Support for Databricks! Now you can use MetricFlow with your Databricks-backed warehouse! (@courtneyholcomb)
  • The ability to define constraints on input measures for ratio and expr metric types. This is a temporary quality of life improvement until full-featured derived metrics (i.e., metrics based on other metrics) are available. (@tlento)
  • Support for different time granularities, improved group by expression handling, and corrected constraint handling on semi-additive measures (@WilliamDee)
  • Support for count as a measure aggregation type. Note this is implemented as an alias around sum, so use of the DISTINCT keyword in expressions is not supported, and will be blocked via validation in a separate update. Users wishing for a COUNT(DISTINCT thing) equivalent should continue to use the count_distinct aggregation type. (@WilliamDee)
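Because count is implemented as an alias around sum, every row contributes to the total, which is why a COUNT(DISTINCT ...) equivalent needs the separate count_distinct aggregation type. A small Python illustration of the difference (illustrative only, not MetricFlow code):

```python
# Illustrative only -- not MetricFlow code.
# If COUNT is an alias around SUM, every row contributes, so duplicate
# values are counted; COUNT(DISTINCT ...) needs its own aggregation type.
rows = ["a", "b", "a", "c"]

count_via_sum = sum(1 for _ in rows)  # behaves like COUNT(x)
count_distinct = len(set(rows))       # behaves like COUNT(DISTINCT x)

print(count_via_sum, count_distinct)  # → 4 3
```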

Fixed

  • Resolved incorrect constraint handling with semi-additive measures (@WilliamDee)
  • Eliminated Dataclass deserialization errors on default parameters (@tlento)

Updated

  • Optimized multi-hop join candidate selection (@plypaul)
  • Improved error handling and error messages on config validation (@QMalcolm, @tlento)
  • Streamlined our project README (@nhandel)
  • CLI now accepts metric-only queries in limited circumstances. See the assertion checks in the PR for details. We will announce this as a new feature when it is complete, but for the time being users may try it out to see if it meets their needs in its current form. (@WilliamDee)

v0.111.1

30 Aug 21:54
1b8f491

Highlights

We now have a basic Visual Studio Code extension for authoring configs, with simple support for auto-completion and inline schema validation, courtesy of @jack-transform!

This release also features several bugfixes and improvements, most notably a dramatic speedup (as much as 10x) in plan building for complex queries.

Changes:

Added

  • Metric model template for Shopify metrics - if you use Shopify, this template can get you up and running for tracking your Shopify metrics! (@hesham-nawaz)

Fixed

  • Fixed warehouse validation failures caused by interaction with measure-specific aggregation times (@QMalcolm)
  • Resolved intermittent semantic validation failures due to unpicklable pydantic objects (@tlento)
  • Fixed edge case with semi-additive measures failing on certain join dimension connections (@WilliamDee)
  • Restructured semantic container dependencies to eliminate certain classes of circular import errors (@tlento)

Updated

  • Replaced Pydantic objects with serializable Dataclasses to improve plan building speed. This can reduce time to generate dataflow plans by 10-20x for complex queries. (@plypaul)
  • Refactored validation logic to make it easier to reason about (@QMalcolm)
  • Modified inference detection rules to make it more likely to classify low cardinality string columns as categorical dimensions (@tlento)

v0.111.0

11 Aug 19:58
9438f31

MetricFlow 0.111.0 was released today!

The big news is that we have added data source inference as a Beta feature. It lets new users get started with MetricFlow quickly by bootstrapping their data source configurations from a warehouse connection.

Please note that this is still a Beta feature. As such, it should not be expected to be free of bugs, and its CLI/Python interfaces might change without prior notice.

For full details see our release notes.

[0.111.0] - 2022-08-10

Added

  • Early beta for data source inference - Snowflake users can now run a command to bootstrap data source definitions based on warehouse metadata for tables in a given schema. Check out mf infer --help for more details. Feedback welcome! (@serramatutu)
  • Support for semi-additive measures - semi-additive measures are measures which can be aggregated across some dimensions, but not others. The most common use case for this is an end-of-period measure, such as a statement balance or end of day inventory hold. As of this release we support semi-additive restrictions bound by any time dimension. See the description on PR #183 for more details on usage. (@WilliamDee)
  • Expanded data warehouse validations - data warehoue validation runs will now check to ensure all measures, identifiers, and dimensions defined in data sources are queryable. (@QMalcolm)

Fixed

  • Resolved serialization error on custom Pydantic objects in the model parser (@tlento)
  • Cleaned up error message handling and missing test dependencies in model validation (@tlento)
  • Eliminated a class of circular import errors by forking reference classes from the specs module (@WilliamDee)
  • Fixed error state when specifying a where constraint with metric_time (@plypaul)

Updated

  • Updated the tutorial to use metric_time instead of arbitrary time dimensions (@plypaul)
  • Increased strictness of typechecking enforcement (@tlento)
  • Removed unnecessary "all time" WHERE clause rendering for cases where time constraints were not provided (@WilliamDee)
  • Updated Click version from ^8.1.3 to >=7.1.2 to temporarily resolve dependency issue for downstream Flask 1.1.4 usage (@jack-transform)
  • Updated Jinja2 version from 2.11.0 to >=2.11.0 to allow downstream Flask 1.1.4 users to update to 2.x.x (@jpreillymb, @tlento)

v0.110.1

25 Jul 21:11
4bce6d0

Patch fixes and minor improvements on v0.110.0:

  • Fixed a bug where columns would occasionally render in the wrong order (@WilliamDee)
  • Cleaned up package metadata to fix broken links on PyPI (@tlento)
  • Render BETWEEN expressions to streamline SQL time range comparisons (@WilliamDee)
  • Consolidated error detection in validation to surface all errors in a single pass (@QMalcolm)

Get the latest here: https://pypi.org/project/metricflow/0.110.1/

v0.110.0

22 Jul 17:20
fc2326b

MetricFlow 0.110.0 was released today!

The big news is we have added support for setting custom time dimensions on a measure-by-measure basis. This change also means we no longer require the primary time dimensions to have the same names across all data sources.

To run a query for a time series metric computation, simply request the metric_time dimension, and MetricFlow will use the aggregation time dimension associated with each measure in your query and line everything up on your time series chart for you. You don't even need to know the names of the time dimensions! Please note that metric_time is now a reserved name.
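As a hedged illustration of the paragraph above (the metric name `revenue` is a placeholder, not from this release note):

```shell
# Requesting metric_time instead of a named time dimension; each measure's
# own aggregation time dimension is resolved automatically.
mf query --metrics revenue --dimensions metric_time
```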

Speaking of reserved names, we have also improved our validation against SQL reserved keywords, which should provide more rapid feedback in the metric config development workflow.

For full details see our release notes.

[0.110.0] - 2022-07-21

Breaking Changes

  • Updated query inputs for time series dimensions to use metric_time instead of dimension names, since it is now possible for measures to have different time dimensions for time series aggregation. This also removes the restriction that all data sources have the same primary time dimension name. However, users issuing queries might experience exceptions if they are not using metric_time as their requested primary time dimension. (@plypaul)
  • Added enforcement for new reserved keyword metric_time (@tlento)
  • Reordered column output to time dimension, identifiers, dimensions, metrics, which could break automation relying on order-dependent parsing of CLI output. We encourage affected users to switch to using the API, and to access the resulting data frame with order-independent (i.e., by name) access to column values. (@WilliamDee)
  • Removed support for SQLite - expected impact is minimal as the repo has been cut to DuckDB for in memory testing (@plypaul)

Added

  • Support for specifying measure-specific time dimensions for time series aggregations (@plypaul)
  • Validation against use of SQL reserved keywords as element names, which previously would have resulted in SQL errors at query time (@QMalcolm)