I have many TBs of data in approximately 1 million tables in a single BigQuery project, spread across several datasets located in the US. I need to move all of this data to datasets hosted in the EU. What is my best option for doing this?
- I could export the tables to Google Cloud Storage and re-import them using load jobs, but there is a limit of 10,000 load jobs per project per day (a rough sketch of this approach is below)
- I could do it as queries with "allow large results" writing to a destination table, but that does not work across regions
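
Here is roughly what I mean by the export + load route from the first bullet. This is only a sketch; the project, dataset, table, and bucket names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my_project")

# 1) Extract the US table to a US bucket as sharded Avro files.
extract_job = client.extract_table(
    "my_project.us_dataset.my_table",
    "gs://my-us-bucket/my_table/*.avro",
    location="US",
    job_config=bigquery.ExtractJobConfig(destination_format="AVRO"),
)
extract_job.result()

# 2) Copy gs://my-us-bucket/my_table/ to an EU bucket
#    (e.g. with gsutil or Storage Transfer Service) -- omitted here.

# 3) Load the files into the EU dataset. This is the step that consumes
#    the per-project daily load-job quota: at least one load job per table.
load_job = client.load_table_from_uri(
    "gs://my-eu-bucket/my_table/*.avro",
    "my_project.eu_dataset.my_table",
    location="EU",
    job_config=bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO),
)
load_job.result()
```

Even though a single load job can pick up many files, it is still at least one load job per table, so with ~1 million tables the 10,000-per-day quota means roughly 100 days of loading.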
The only option I see right now is to re-insert all of the data using the BQ streaming API, which would be cost-prohibitive.
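By the streaming API I mean something along these lines (again just a sketch, names are placeholders): read each US table and re-insert its rows into the EU table, paying streaming-insert charges on every row, which is what makes it prohibitive at this volume.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my_project")

source_rows = client.list_rows("my_project.us_dataset.my_table")  # paged read of the source table
eu_table = client.get_table("my_project.eu_dataset.my_table")     # destination table with schema

batch = []
for row in source_rows:
    batch.append(dict(row.items()))
    if len(batch) >= 500:                      # keep each insert_all request small
        errors = client.insert_rows(eu_table, batch)
        assert not errors, errors
        batch = []
if batch:
    errors = client.insert_rows(eu_table, batch)
    assert not errors, errors
```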
What is the best way to move large volumes of data in many tables across regions in BigQuery?