I had the same problem, though with the deployed version rather than the local development server. After almost giving up, I found a workaround for loading data with appcfg. Here is what I noticed:
This did not work for me:
gcloud auth login
appcfg.py upload_data --config_file bulkloader.yaml --url=http://<yourproject>.appspot.com/remote_api --filename places.csv --kind=Place --email=<you@gmail.com>
Error output:
11:10 AM Uploading data records.
[INFO    ] Logging to bulkloader-log-20160108.111007
[INFO    ] Throttling transfers:
[INFO    ] Bandwidth: 250000 bytes/second
[INFO    ] HTTP connections: 8/second
[INFO    ] Entities inserted/fetched/modified: 20/second
[INFO    ] Batch Size: 10
Error 302: --- begin server output ---
--- end server output ---
As expected, I was not asked to authenticate again during the second command, yet appcfg apparently still failed to authenticate my account. I am on Windows 7 with Python 2.7, using appcfg.py from the Google App Engine Python SDK and gcloud from the Google Cloud SDK.
However, https://cloud.google.com/container-registry/docs/auth shows that you can print the access token with gcloud and then pass it to the appcfg command manually, which worked for me.
This is the login sequence that worked:
gcloud auth login
gcloud auth print-access-token
The second command prints an access token that you can pass to appcfg:
appcfg.py upload_data --oauth2_access_token=<oauth2_access_token> --config_file bulkloader.yaml --url=http://<yourproject>.appspot.com/remote_api --filename places.csv --kind=Place --email=<you@gmail.com>
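To avoid copying the token by hand each time, the two steps can be combined with shell command substitution. This is just a sketch of the same workaround; the placeholders <yourproject> and <you@gmail.com>, and the bulkloader.yaml and places.csv files, are the ones from the commands above and must be replaced with your own values:

```shell
#!/bin/sh
# Fetch a fresh OAuth2 access token from gcloud and pass it
# directly to appcfg.py instead of pasting it manually.
# Assumes you have already run `gcloud auth login` once.
TOKEN="$(gcloud auth print-access-token)"

appcfg.py upload_data \
  --oauth2_access_token="$TOKEN" \
  --config_file bulkloader.yaml \
  --url=http://<yourproject>.appspot.com/remote_api \
  --filename places.csv \
  --kind=Place \
  --email=<you@gmail.com>
```

Access tokens typically expire after about an hour, so fetching one immediately before the upload also avoids stale-token failures.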
Output of a successful data upload:
10:42 AM Uploading data records.
[INFO    ] Logging to bulkloader-log-20160108.104246
[INFO    ] Throttling transfers:
[INFO    ] Bandwidth: 250000 bytes/second
[INFO    ] HTTP connections: 8/second
[INFO    ] Entities inserted/fetched/modified: 20/second
[INFO    ] Batch Size: 10
[INFO    ] Opening database: bulkloader-progress-20160108.104246.sql3
[INFO    ] Connecting to <yourproject>.appspot.com/remote_api
[INFO    ] Starting import; maximum 10 entities per post
.
[INFO    ] 3 entities total, 0 previously transferred
[INFO    ] 3 entities (4099 bytes) transferred in 1.7 seconds
[INFO    ] All entities successfully transferred
Hope this helps anyone hitting this problem. I still do not know the root cause, but this workaround works for me.
By the way, I noticed the same problem on a Mac.