I have large CSV and Excel files. I read them and dynamically generate a CREATE TABLE script based on the fields and types found in each file, then load the data into the newly created table.
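The generation step looks roughly like this (a simplified sketch of my code; the type guessing is only to show the idea, my real logic handles more cases):

    import csv

    def infer_bigquery_schema(csv_path):
        # Read the header row and the first data row to guess column types.
        with open(csv_path, newline='') as f:
            reader = csv.reader(f)
            header = next(reader)
            sample = next(reader)

        def guess_type(value):
            # Very rough guessing, just for illustration.
            try:
                int(value)
                return 'INTEGER'
            except ValueError:
                pass
            try:
                float(value)
                return 'FLOAT'
            except ValueError:
                return 'STRING'

        return [{'name': col, 'type': guess_type(val)}
                for col, val in zip(header, sample)]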
I read about this and realized that for large amounts of data I have to send it with jobs.insert() instead of tabledata.insertAll().
This is the call I make (it works for small files, but not for large ones):
result = client.push_rows(datasetname,table_name,insertObject)
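For context, client here is a BigQuery-Python client (the project and key names are placeholders for my real values):

    from bigquery import get_client

    client = get_client(project_id,
                        service_account=service_account,
                        private_key_file=key_file,
                        readonly=False)
    # insertObject is a list of row dicts built from the parsed file.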
When I call push_rows with a large file, I get this error on Windows:
[Errno 10054] An existing connection was forcibly closed by the remote host
and this one on Ubuntu:
[Errno 32] Broken pipe
When I went through the BigQuery-Python source, I found that push_rows uses tabledata().insertAll().
How can I do a jobs.insert() load using this library? I know the data can be staged in Google Cloud Storage and loaded from there, but I need a way to upload it directly.
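In other words, what I am after is the equivalent of the raw google-api-python-client load job below, but through BigQuery-Python (this is a sketch of my understanding, not tested code; the names are placeholders):

    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    # credentials would come from a service account, as with BigQuery-Python.
    bigquery_service = build('bigquery', 'v2', credentials=credentials)

    job_body = {
        'configuration': {
            'load': {
                'destinationTable': {
                    'projectId': project_id,
                    'datasetId': datasetname,
                    'tableId': table_name,
                },
                'sourceFormat': 'CSV',
                'skipLeadingRows': 1,
                # schema is the field list generated from the file.
                'schema': {'fields': schema},
            }
        }
    }

    # Attach the local file as a resumable media upload, so no
    # intermediate Google Cloud Storage staging is needed.
    media = MediaFileUpload(csv_path,
                            mimetype='application/octet-stream',
                            resumable=True)
    job = bigquery_service.jobs().insert(projectId=project_id,
                                         body=job_body,
                                         media_body=media).execute()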