
🐛 [firestore-bigquery-export] Import Script Throws 'Request Entity Too Large' #2132

Open
larstbone opened this issue Jul 16, 2024 · 1 comment
Labels: extension: firestore-bigquery-export, type: bug

larstbone commented Jul 16, 2024

[REQUIRED] Step 2: Describe your configuration

  • Extension name: firestore-bigquery-export
  • Extension version: 0.1.51
  • Configuration values (redact info where appropriate):

BigQuery Dataset location->us
BigQuery Project ID->xxxxxxxxxxxxxxxxx
Collection path->xxxxxxxxxxxxxxxxxxx
Enable Wildcard Column field with Parent Firestore Document IDs (Optional)->false
Dataset ID->firestore_events
Table ID->events
BigQuery SQL table Time Partitioning option type (Optional)->NONE
BigQuery Time Partitioning column name (Optional)->Parameter not set
Firestore Document field name for BigQuery SQL Time Partitioning field option (Optional)->Parameter not set
BigQuery SQL Time Partitioning table schema field(column) type (Optional)->omit
BigQuery SQL table clustering (Optional)->Parameter not set
Maximum number of synced documents per second (Optional)->100
Backup Collection Name (Optional)->Parameter not set
Transform function URL (Optional)->Parameter not set
Use new query syntax for snapshots->yes
Exclude old data payloads (Optional)->yes
Use Collection Group query (Optional)->no
Cloud KMS key name (Optional)->Parameter not set

[REQUIRED] Step 3: Describe the problem

We cannot import existing documents, which may be up to 900 KB in size.

Steps to reproduce:

  1. Configure the extension on a collection
  2. Reconfigure the extension, setting EXCLUDE_OLD_DATA to true
  3. While reconfiguring, confirm that DO_BACKFILL is no longer available
  4. Create a document of roughly 900 KB (see the reproduction sketch below)
  5. Confirm the document does not sync and fails with the error "task size too large"
  6. Run the fs-bq-import-collection script to try to import the existing data
  7. Confirm the script fails with "Request Entity Too Large"
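For reference, a minimal sketch of step 4 using the Firebase Admin SDK (the collection path, document ID, and field name below are placeholders, not our real configuration):

```ts
import * as admin from "firebase-admin";

admin.initializeApp();

async function createLargeDocument(): Promise<void> {
  // Build a string payload of roughly 900 KB, just under Firestore's 1 MiB document limit.
  const payload = "x".repeat(900 * 1024);

  await admin
    .firestore()
    .collection("my-collection") // placeholder: the collection the extension watches
    .doc("large-doc")
    .set({
      payload,
      createdAt: admin.firestore.FieldValue.serverTimestamp(),
    });

  // After this write, check the extension logs for the "task size too large" error.
  console.log("Wrote ~900 KB document");
}

createLargeDocument().catch(console.error);
```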

During installation of the extension we cannot backfill existing data, because DO_BACKFILL is currently disabled by #2005.

Expected result
  1. DO_BACKFILL is available during both installation and reconfiguration of the extension
  2. The fs-bq-import-collection script does not throw a "Request Entity Too Large" error when old data is ignored via the EXCLUDE_OLD_DATA flag
Actual result
  1. DO_BACKFILL is not available when the extension is installed nor when it is reconfigured
  2. The fs-bq-import-collection script fails with "Request Entity Too Large" (a size-check sketch is included below)
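As a rough way to identify which documents trip the importer, here is a small sketch (assuming application-default credentials; the collection path and threshold are placeholders) that approximates each document's payload size:

```ts
import * as admin from "firebase-admin";

admin.initializeApp();

// Report documents whose serialized payload approaches the size that the
// import script rejects with "Request Entity Too Large".
async function reportLargeDocuments(
  collectionPath: string,
  thresholdBytes = 800 * 1024
): Promise<void> {
  const snapshot = await admin.firestore().collection(collectionPath).get();
  for (const doc of snapshot.docs) {
    // JSON length is only an approximation of the wire size, but it is close
    // enough to flag ~900 KB documents.
    const approxBytes = Buffer.byteLength(JSON.stringify(doc.data()), "utf8");
    if (approxBytes > thresholdBytes) {
      console.log(`${doc.ref.path}: ~${Math.round(approxBytes / 1024)} KB`);
    }
  }
}

reportLargeDocuments("my-collection").catch(console.error);
```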
@larstbone larstbone added the type: bug Something isn't working label Jul 16, 2024
@pr-Mais pr-Mais added the extension: firestore-bigquery-export Related to firestore-bigquery-export extension label Jul 18, 2024
@nikcaryo-super

Any updates on this?
