dlt version
1.6.1
Describe the problem
Hey, I'm loading data into BigQuery and I want to use the insert-from-staging strategy (https://dlthub.com/docs/general-usage/full-loading#the-insert-from-staging-strategy). But when I look at the queries that actually run on BigQuery, I see:
DROP TABLE IF EXISTS `<proj_id>`.`<dataset>`.`<destination_table>`;
CREATE TABLE `<proj_id>`.`<dataset>`.`<destination_table>` CLONE `<proj_id>`.`<dataset>`.`<staging_table>`;
So it looks like it still runs the staging-optimized strategy?
Expected behavior
I expect the query to look more like:
INSERT INTO `{destination_table}` ({columns})
SELECT {values}
FROM `{staging_table}` s;
Steps to reproduce
Run a pipeline with a BigQuery destination, set "replace_strategy": "insert-from-staging", and then look at the queries that are run in your project history.
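For reference, here is a minimal sketch of such a pipeline in Python. The connection string, table, and dataset names are placeholders (not from the original report), and BigQuery credentials are assumed to come from the usual dlt config/secrets setup:

import os

import dlt
from dlt.sources.sql_database import sql_table

# Hypothetical source: any SQL database reachable from the pipeline.
source = sql_table(
    credentials="postgresql://user:password@localhost:5432/mydb",
    table="my_table",
)

# The replace strategy can also be set in config.toml under [destination];
# the environment-variable form is used here to keep the sketch self-contained.
os.environ["DESTINATION__REPLACE_STRATEGY"] = "insert-from-staging"

pipeline = dlt.pipeline(
    pipeline_name="bq_replace_repro",
    destination="bigquery",
    dataset_name="my_dataset",
)

# write_disposition="replace" triggers the full-load path whose strategy is in question.
info = pipeline.run(source, write_disposition="replace")
print(info)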
Operating system
macOS
Runtime environment
Docker, Docker Compose
Python version
3.12
dlt data source
sql_table
dlt destination
Google BigQuery
Other deployment details
N/A
Additional information
The workaround would be to first make dlt write out to a staging dataset and then run something like this query yourself in some function afterwards:
CREATE OR REPLACE TABLE `{destination_table}` AS SELECT * FROM `{staging_table}` s;
Which I guess could work, but it would be nicer if it worked as expected :)
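As a rough sketch of that workaround with the google-cloud-bigquery client (the project, dataset, and table names below are hypothetical):

from google.cloud import bigquery

def promote_staging_table(project: str, dataset: str, staging_dataset: str, table: str) -> None:
    # Copy the freshly loaded staging table over the destination table in one statement.
    client = bigquery.Client(project=project)
    query = (
        f"CREATE OR REPLACE TABLE `{project}.{dataset}.{table}` "
        f"AS SELECT * FROM `{project}.{staging_dataset}.{table}` s"
    )
    # result() blocks until the query job finishes and raises on errors.
    client.query(query).result()

# Run after the dlt pipeline has finished loading into the staging dataset.
promote_staging_table("my-project", "my_dataset", "my_dataset_staging", "my_table")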