So, the question here really only makes sense once you name the context.
BigQuery is a managed service: your data is replicated, and you trust Google Cloud to keep it available. In the event of a failure, BigQuery engineers will handle the situation; you won't be able to trigger a rollback, healing, or anything else yourself, you just wait until it works again.
Suppose all the data is somehow destroyed (a large earthquake plus a bombardment hits several data centers, etc.): your data is lost unless you have a source to rebuild it from. This is true for any data you put under someone else's care, not just your BQ project. It's also possible that a hacker uses your CEO's credentials to destroy all your backups and then your live instances; at that point all company data is lost and you cannot restore anything, since you no longer have backups.
To be prepared for the worst-case disaster (the BQ service goes down, Google shuts the service off, or something worse), you need a backup you can restore your data from. It is enough to keep your raw files and be able to replay them. We usually keep them, because even a few years later we can move everything to another provider (say, Skynet Data Center, etc. :)) and replay them to reproduce our database state.
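As a minimal sketch of that replay step, assuming the raw files are newline-delimited JSON sitting in a GCS bucket (all project, bucket, dataset, and table names below are made up for illustration):

```python
# Sketch: replay raw newline-delimited JSON files from GCS into a fresh
# BigQuery table. All names here are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # let BigQuery infer the schema from the raw files
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load every raw file under the prefix into the restored table.
load_job = client.load_table_from_uri(
    "gs://my-raw-bucket/events/*.json",
    "my-project.restored_dataset.events",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-project.restored_dataset.events")
print(f"Restored {table.num_rows} rows")
```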
You can export BigQuery tables to Google Cloud Storage and move the files from there to cold storage or wherever you want. Files of up to 5 TB can also be loaded back in if you need to recover.
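On the export side, a similar sketch with the same client library (again, all names are hypothetical) could run as a periodic backup job:

```python
# Sketch: export a BigQuery table to Google Cloud Storage as Avro.
# Project, dataset, table, and bucket names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.job.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.AVRO  # Avro keeps schema with data
)

# The wildcard URI lets BigQuery shard large tables across multiple files.
extract_job = client.extract_table(
    "my-project.my_dataset.events",
    "gs://my-backup-bucket/exports/events-*.avro",
    job_config=job_config,
)
extract_job.result()  # wait for the export to complete
```

Once the files are in GCS, you can move them to a colder storage class (Nearline, Coldline, Archive) to keep the backup cheap.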