I have some notebooks in Datalab. For various reasons, I would like to access the same data from a local Jupyter notebook on my machine.
This question suggested several approaches, but so far I can't get any of them to work.
In particular, the gcloud library:
from gcloud import bigquery
client = bigquery.Client()
gives me a stack trace whose last line is:
ContextualVersionConflict: (protobuf 2.6.1 (/usr/local/lib/python2.7/dist-packages), Requirement.parse('protobuf!=3.0.0.b2.post1,>=3.0.0b2'), set(['gcloud']))
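If I'm reading that traceback right, gcloud wants protobuf>=3.0.0b2 but 2.6.1 is installed, so my (unverified) guess is that upgrading protobuf in the same Python environment might clear the conflict, e.g. from a notebook cell:

!pip install --upgrade 'protobuf>=3.0.0b2'

(followed by a kernel restart so the upgraded package is picked up)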
The Pandas library seems promising:
import pandas as pd

df = pd.io.gbq.read_gbq(
    'SELECT CCS_Category_ICD9, Gender, Admit_Month '
    'FROM [xxxxxxxx-xxxxx:xxxx_100MB_newform.xxxxxx_100MB_newform] '
    'ORDER BY CCS_Category_ICD9',
    project_id='xxxxxxxx-xxxxx')
It also gives me a stack trace:
IOError: [Errno 2] No such file or directory: '/usr/local/lib/python2.7/dist-packages/httplib2-0.9.1.dist-info/METADATA'
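The missing METADATA file looks to me like a corrupted httplib2 installation rather than a BigQuery problem; my untested assumption is that force-reinstalling the package could repair it:

!pip install --force-reinstall httplib2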
Perhaps I have a problem with auth in the pandas approach, although my browser is currently auth'd for the project? Or am I missing a dependency?
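To rule auth out, I was thinking of forcing a fresh credential flow; as far as I can tell, read_gbq accepts a reauth flag for this (the trivial query below is just my placeholder for testing):

# force re-authentication via the browser flow to test credentials
df = pd.io.gbq.read_gbq('SELECT 1', project_id='xxxxxxxx-xxxxx', reauth=True)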
Any suggestions or recommendations are appreciated.
What is the best way to access a BigQuery data source from a local Jupyter notebook?
pandas google-cloud-platform google-bigquery google-cloud-datalab
dartdog