Query caching in BigQuery

I run queries on BigQuery against datasets with several hundred million records.

This is faster than other solutions I have tried, but queries still take 10-30 seconds, which is not acceptable for online queries.

Is there a best practice, or a cache / in-memory technology, that can be used to speed things up?

I am also considering creating pivot / aggregation tables, but then I could just as well do that in another database.

1 answer

If the query is too slow but its results can be reused, you can save the output of your queries by specifying a destination table. Then you can read the results from that destination table using the tabledata.list API rather than running the query again. Beyond that, I do not know of any best practice that makes queries faster other than optimizing the queries themselves.
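As a sketch of that reuse pattern at the application level (the `run_query` callable, the TTL value, and the stub below are hypothetical; in a real setup `run_query` would submit the BigQuery job with a destination table and read it back via tabledata.list):

```python
import time

class QueryCache:
    """Memoize query results in memory, keyed by the query text."""

    def __init__(self, run_query, ttl_seconds=300):
        self.run_query = run_query   # callable: sql -> rows (e.g. a BigQuery call)
        self.ttl = ttl_seconds       # how long a saved result stays fresh
        self._store = {}             # sql -> (fetched_at, rows)

    def get(self, sql):
        entry = self._store.get(sql)
        if entry is not None:
            fetched_at, rows = entry
            if time.time() - fetched_at < self.ttl:
                return rows          # fast path: serve the saved result
        rows = self.run_query(sql)   # slow path: actually run the query
        self._store[sql] = (time.time(), rows)
        return rows

# Usage with a stub in place of a real BigQuery call:
calls = []

def fake_run_query(sql):
    calls.append(sql)
    return [("row", 1)]

cache = QueryCache(fake_run_query, ttl_seconds=60)
first = cache.get("SELECT 1")    # runs the query
second = cache.get("SELECT 1")   # served from the cache; no second run
```

The TTL is the main tuning knob: it trades freshness for latency, which is usually an acceptable trade for online dashboards over slowly changing aggregates.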

