Asking here per request: as this uncouples core and adapters, that seems like a pretty big shift. Why is this not version 2.0.0? Or, if for internal reasons it must be 1.8.0, are adapters free to version their compatible releases as 2.0.0? I want to signal to customers that they can't just take latest without making the change, particularly in environments where they get latest by default, like Databricks Workflows.
How do we decide the full spectrum of SQL types and YAML types to put in the type tests? Is it basically: here are all the SQL types my database supports, and here are all the ways I expect users might want to specify them? Is there any standardized way to communicate this mapping to users?
@dataders, the test_unit_testing test is generating a nested CTE, which is not supported in Fabric yet.
Also, I added render_limited() since the LIMIT keyword is not supported by T-SQL. It looks like the default implementation from dbt-core is picked up instead of the adapter implementation.
Hi. Curious to know if anyone has successfully implemented dbt-core (v1.8.6) unit testing with dbt-athena (v1.8.4) for Iceberg tables. I tried running the unit test, but it attempts to create a Hive table instead.
Overview
This discussion is for communicating to adapter maintainers the scope of work needed to make use of the changes coming via dbt-core 1.8.0. If you have questions or concerns, please ask them here for posterity.
Please consider this a living document between now and the date of final release. If there's something missing, please comment below!
Loom video overview (12 min)
TBD
release timeline
The table below gives the milestones up to and including the final release. It will be updated with each subsequent release.
prior maintainer upgrade guides
Example Diffs
TL;DR
This upgrade is a big deal, and it should be the last time in a while that dbt Labs asks maintainers for this amount of work. There are two logical pieces of work here that, in theory, aren't especially complex:
Decoupled Dependency on dbt-core
Context
Excerpt from #9171
How to implement
Everything you need to know is in dbt-adapters#87. If you have a question or concern, please ask it there.
Unit Testing
Unit testing is a long-awaited feature that is finally coming to dbt. It is a powerful feature that will allow users to test their models in isolation from the rest of their project. This should not require a great deal of work on the part of the adapter maintainer, but, more importantly, it changes how dbt is used, so it's important that we have test coverage for these new scenarios.
support for `--empty` flag to enable "dry run" mode
How to implement `--empty` support
dbt-core#8971 added a new `BaseRelation` method: `render_limited()`. Effectively, this method wraps a model's `SELECT` statement in a subquery.
If your data platform supports `LIMIT` clauses, you have no work to do. However, some SQL dialects (e.g. T-SQL) do not support `LIMIT` clauses. In this case, you will need to implement `render_limited()` for your adapter.
`--empty` tests: `dbt.tests.adapter.empty.test_empty.BaseTestEmpty`
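To make the override concrete, here is a minimal, self-contained sketch of the idea (not the actual dbt-core source; the constructor and method signatures are simplified for illustration) showing how a T-SQL adapter might replace the default `LIMIT` wrapping with `TOP`:

```python
# Illustrative sketch only: dbt's real BaseRelation carries much more state.
class BaseRelation:
    def __init__(self, name: str):
        self.name = name

    def render(self) -> str:
        return self.name

    def render_limited(self, limit: int = 0) -> str:
        # Default behavior: wrap the relation in a subquery with a LIMIT
        # clause (limit=0 is what --empty "dry run" mode effectively wants).
        return f"(select * from {self.render()} limit {limit}) _dbt_subq"


class TSQLRelation(BaseRelation):
    def render_limited(self, limit: int = 0) -> str:
        # T-SQL has no LIMIT clause, so use SELECT TOP inside the subquery.
        return f"(select top {limit} * from {self.render()}) _dbt_subq"
```

With an override like this in place, compiled SQL for `--empty` runs selects zero rows via `TOP 0` rather than emitting an unsupported `LIMIT 0`.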
macros
`cast` and `safe_cast` support
dbt does a vast amount of type casting behind the scenes, such as:
- `agate` (our current csv reader) <> Python
- YAML <> Python
In the case of unit testing, we needed to further extend the `yaml` <> `python` casting to allow for more specific definition of mock data.
Theoretically this should be a no-op for most adapters, but it's worth checking to make sure that the `cast` and `safe_cast` macros are supported in your adapter. For example, dbt-spark now has a `safe_cast` that it did not before (dbt-spark#files).
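If your adapter lacks a `safe_cast`, the dispatch-and-fallback pattern dbt macros use looks roughly like this (a sketch based on the common default-implementation pattern; the Snowflake override is illustrative — verify against the macros actually shipped in dbt-adapters before relying on it):

```sql
{% macro safe_cast(field, type) %}
  {{ return(adapter.dispatch('safe_cast', 'dbt')(field, type)) }}
{% endmacro %}

{% macro default__safe_cast(field, type) %}
    {#-- Most warehouses lack a TRY_CAST-style function, so fall back to a plain cast --#}
    cast({{ field }} as {{ type }})
{% endmacro %}

{% macro snowflake__safe_cast(field, type) %}
    {#-- Dialects with a lenient cast can dispatch to it instead --#}
    try_cast({{ field }} as {{ type }})
{% endmacro %}
```

The fallback is what lets `safe_cast` defer to `cast` on platforms (or for types) where safe casting isn't available.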
Tests within `dbt.tests.adapter.unit_testing`
For unit testing, there are a handful (3) of functional tests worth implementing as an adapter to ensure both baseline functionality and expected behavior when mocking inputs with various types:
- `test_types.BaseUnitTestingTypes`
  - requires a `data_types` fixture (example)
  - `data_types` is a list of (`sql_value`, `yaml_value`) pairs, where `sql_value` should be a literal in the upstream "real" input, and `yaml_value` is what the value looks like in YAML when it is being mocked out by the user
  - the `unit` materialization makes use of `safe_cast` to cast the user-provided YAML value to the expected input type while building fixture CTEs
  - `safe_cast` defers to `cast` if the adapter does not support safe casting to a particular type (e.g. Snowflake's `safe_cast` does not support variant)
  - `safe_cast` and `cast` may need to be extended as appropriate to support a fuller range of inputs that can be expected from the user in the context of specifying unit test input fixtures
- `test_case_insensitivity.BaseUnitTestCaseInsensivity`
  - Note: there is a misspelling, `Insensivity` not `Insensitivity`
  - it is possible to override `get_fixture_sql`, which I'd advise against, as those macros provide the main adapter framework/entrypoints of the unit testing functionality, and our 1p implementations have not had to
- `test_invalid_input.BaseUnitTestInvalidInput`
  - the `format_row` macro is what provides this functionality in the default implementation
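For reference, a `data_types` payload for a hypothetical adapter might look like the following (a self-contained sketch; in a real adapter these pairs live in a pytest fixture on your `BaseUnitTestingTypes` subclass, and the literals must match your dialect):

```python
# (sql_value, yaml_value) pairs: sql_value is the literal placed in the
# upstream "real" input; yaml_value is how a user mocks it in YAML.
data_types = [
    ("1", "1"),                                    # integer
    ("'text value'", "text value"),                # string
    ("true", "true"),                              # boolean
    ("cast('2020-01-02' as date)", "2020-01-02"),  # date
    ("cast('1.5' as numeric)", "1.5"),             # numeric/decimal
]

# Sanity-check the shape the test framework expects: a list of 2-tuples
# of strings, one pair per type your platform should round-trip.
for sql_value, yaml_value in data_types:
    assert isinstance(sql_value, str) and isinstance(yaml_value, str)
```

The wider the spread of types in this list, the more confidence you have that `safe_cast`/`cast` cover the inputs users will actually write in their unit test fixtures.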
Additional Tests
Important: these are new tests introduced into the adapter zone that you should have in your adapter.
TBD
Materialized Views Refactor
When the `1.7` upgrade guide was originally published (October 2023), the "Materialized Views Refactor" was stubbed out. In December it was more fully fleshed out.
If your warehouse supports materialized views, you should check it out. Even if it doesn't, the changes implemented represent a vision of the future for how materializations are handled in dbt.