Credential error while integrating Google Drive with BigQuery

I use Google BigQuery and I want to integrate it with Google Drive. In BigQuery, I point an external table at the URL of a Google Sheets spreadsheet to pull in my data. The table updates fine, but when I run a query in the Google Sheets add-on (OWOX BI BigQuery Reports):

SELECT * FROM [datasetName.TableName]

I get an error message:

Request error: tableUnavailable: No suitable credentials found to access Google Drive. Contact the table owner for assistance.

4 answers

I just ran into the same problem in some code I wrote. It may not help you directly, since it sounds like you are not responsible for the code, but it may help someone else, or you can ask the vendor of the tool you use to read this :-)

So I had to do a couple of things:

  • Enable the Drive API for my Google Cloud Platform project in addition to BigQuery.
  • Make sure your BigQuery client is created with both the BigQuery scope and the Drive scope (see the sketch below).
  • Make sure the Google Sheets document you want BigQuery to access is shared with the "...@appspot.gserviceaccount.com" account that your Google Cloud Platform project identifies itself as.

After that, I was able to successfully query the Google Sheets-backed tables from BigQuery in my own project.
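
A minimal sketch of what that looks like, assuming the google-cloud-bigquery and google-auth packages; the key file path and table name are placeholders, not the exact code from this answer:

    # Build a BigQuery client whose credentials carry both the BigQuery
    # scope and the Drive scope, so Sheets-backed tables can be read.
    from google.cloud import bigquery
    from google.oauth2 import service_account

    scopes = [
        'https://www.googleapis.com/auth/bigquery',
        'https://www.googleapis.com/auth/drive',
    ]

    # 'key.json' is a placeholder for your service account key file.
    credentials = service_account.Credentials.from_service_account_file(
        'key.json', scopes=scopes)

    client = bigquery.Client(credentials=credentials,
                             project=credentials.project_id)

    # Standard SQL; the table name is a placeholder.
    for row in client.query('SELECT * FROM `datasetName.TableName`').result():
        print(row)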


As was said previously:

  • Make sure your BigQuery dataset is also shared with the service account you use for authentication.
  • Make sure your linked Google Sheet is also shared with that service account.
  • The Drive API must also be enabled.
  • When using an OAuth client, you need to request both the Drive and the BigQuery scopes.

If you write Python:

  1. credentials = GoogleCredentials.get_application_default() does not accept scopes (at least, I have not found a way :D), so instead:

  2. Build your credentials and query from scratch:

    # Service account credentials carrying both the Drive and the
    # cloud-platform scope, then a plain BigQuery v2 API query.
    from googleapiclient.discovery import build
    from httplib2 import Http
    from oauth2client.service_account import ServiceAccountCredentials

    scopes = ('https://www.googleapis.com/auth/drive.readonly',
              'https://www.googleapis.com/auth/cloud-platform')

    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        '/client_secret.json', scopes)
    http = credentials.authorize(Http())

    bigquery_service = build('bigquery', 'v2', http=http)
    query_request = bigquery_service.jobs()
    query_data = {'query': 'SELECT * FROM [test.federated_sheet]'}
    query_response = query_request.query(
        projectId='hello_world_project', body=query_data).execute()

    print('Query Results:')
    for row in query_response['rows']:
        print('\t'.join(field['v'] for field in row['f']))
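
One gotcha with the above: the Sheet must be shared with the service account's email address. If you are not sure what that address is, it is stored in the key file itself; a small sketch (the path mirrors the placeholder above):

    # Print the service account email from the JSON key file, so you
    # know exactly which account to share the Google Sheet with.
    import json

    with open('/client_secret.json') as f:
        print(json.load(f)['client_email'])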

This probably has the same root cause as: BigQuery credential issues accessing the Google Sheets spreadsheet

Accessing federated tables in Drive requires additional OAuth scopes, and your tool may only be requesting the BigQuery scope. Try contacting the vendor to update their app?
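
If you can get hold of the access token the tool authenticates with, you can check which scopes it actually carries via Google's tokeninfo endpoint; a sketch assuming the requests package, with a placeholder token value:

    # Ask Google which scopes an OAuth access token was granted.
    # Reading a Drive-backed table needs a .../auth/drive* scope in the list.
    import requests

    access_token = 'ya29.placeholder'  # placeholder: the tool's real token
    resp = requests.get(
        'https://www.googleapis.com/oauth2/v3/tokeninfo',
        params={'access_token': access_token},
    )
    print(resp.json().get('scope'))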


If you use pd.read_gbq() like me, this is the best place to find your answer: https://github.com/pydata/pandas-gbq/issues/161#issuecomment-433993166

    import pandas_gbq
    import pydata_google_auth
    import pydata_google_auth.cache

    # Instead of get_user_credentials(), you could do default(), but that may not
    # be able to get the right scopes if running on GCE or using credentials from
    # the gcloud command-line tool.
    credentials = pydata_google_auth.get_user_credentials(
        scopes=[
            'https://www.googleapis.com/auth/drive',
            'https://www.googleapis.com/auth/cloud-platform',
        ],
        # Use reauth to get new credentials if you haven't used the drive scope
        # before. You only have to do this once.
        credentials_cache=pydata_google_auth.cache.REAUTH,
        # Set auth_local_webserver to True to have a slightly more convenient
        # authorization flow. Note, this doesn't work if you're running from a
        # notebook on a remote server, such as with Google Colab.
        auth_local_webserver=True,
    )

    sql = """
    SELECT state_name
    FROM `my_dataset.us_states_from_google_sheets`
    WHERE post_abbr LIKE 'W%'
    """

    df = pandas_gbq.read_gbq(
        sql,
        project_id='YOUR-PROJECT-ID',
        credentials=credentials,
        dialect='standard',
    )

    print(df)
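
For this to run you need both packages installed (pip install pandas-gbq pydata-google-auth); the drive scope in the credentials is what lets BigQuery read the Sheets-backed table.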
