Replies: 1 comment 1 reply
-
Hi @Naihell, I've run into the same issue recently using this extension. My solution was to write a DML query that manually fills in the partition column on any record where it is currently missing. I can confirm that I am now able to write queries that take advantage of the partitioned data. The query was this:
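The original query did not survive in this excerpt. As a rough sketch of what such a DML backfill can look like (the project, dataset, table, and partition-column names below are placeholders, assuming the extension's default changelog layout with a `timestamp` column):

```sql
-- Placeholder names: swap in your own project, dataset, table, and
-- partition column. Assumes the extension's changelog table keeps the
-- original event time in a TIMESTAMP column named `timestamp`.
UPDATE `my_project.my_dataset.my_table_raw_changelog`
SET my_partition_column = timestamp   -- copy event time into the partition column
WHERE my_partition_column IS NULL;    -- only touch rows the importer left empty
```

Note that a DML `UPDATE` over a large table scans the affected partitions, so on a 900k-row table expect it to bill accordingly; running it once after the import finishes should be enough.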
Hopefully you're able to adapt this to your use case.
-
Hey there!
I don't know if this is a silly question, but I've been stuck on a problem for a few days.
I am using the BigQuery export extension, and I need to backfill my table from a collection with about 900k records. My first problem happens when I try version 0.1.40, which had the backfill option. On small collections everything works perfectly, but on my 900k-record collection it doesn't: after a long time there are no errors, yet the Cloud Function only imported about 400k records. I tried the same process twice.
I also tried another approach, using the latest extension version (0.1.43) and the fs-bq-import-collection script. That way I get exactly all the records, but my time-partitioned column is left unpopulated.
It would be great to backfill my data and also have the partition column correctly populated. Any tips on how to proceed?