Cannot upload data into the Google App Engine local datastore

I am following the Google Mobile Shopping Assistant sample, which asks me to import data according to this link.

I tried the steps from the sample (all the code is vanilla; I changed nothing except fixing a warning so it builds with the latest version of Gradle).

I believe that I am missing an important step that is not specified in this example. Can someone give some information about what steps I did wrong?

Here are the steps I followed:

  • run the local Google App Engine backend application
  • run the command `appcfg.py upload_data --config_file bulkloader.yaml --url=http://localhost:8080/remote_api --filename places.csv --kind=Place -e myEmailAddress@gmail.com`

    This command should insert 2 rows into the datastore (places.csv contains two entries).

  • it gave me the following output:

 10:07 AM Uploading data records.
 [INFO ] Logging to bulkloader-log-20151020.100728
 [INFO ] Throttling transfers:
 [INFO ] Bandwidth: 250000 bytes/second
 [INFO ] HTTP connections: 8/second
 [INFO ] Entities inserted/fetched/modified: 20/second
 [INFO ] Batch Size: 10
 Error 302:
 --- begin server output ---
 --- end server output ---
  1. Then I go to http://localhost:8080/admin/buildsearchindex, which displays the message "MaintenanceTasks completed."

  2. Next, I go to http://localhost:8080/_ah/admin, but it displays:

Datastore has no objects in an empty namespace. You need to add data before you can use this tool to view and edit it.
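To make the expectation in the upload step concrete: the bulkloader creates one entity of the configured `--kind` per CSV row, so a two-row places.csv should yield two Place entities. A minimal sketch of that mapping (the column names here are an assumption, not the sample's actual schema):

```python
import csv
import io

# Hypothetical places.csv content with two data rows; the real file ships
# with the Mobile Shopping Assistant sample and may use different columns.
sample = "name,address\nStore A,1 Main St\nStore B,2 Oak Ave\n"

# appcfg.py upload_data turns each data row into one entity of --kind.
rows = list(csv.DictReader(io.StringIO(sample)))
print(len(rows))  # 2 -> two Place entities expected in the datastore
```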


3 answers

Here is what I found through testing. I first went through the same steps and got the same error, but note the `INFO client.py:669 access_token is expired` line in the output:

 MobileAssistant-Data> appcfg.py upload_data --config_file bulkloader.yaml --url=http://localhost:8080/remote_api --filename places.csv --kind=Place -e myEmailAddress@gmail.com
 05:12 PM Uploading data records.
 [INFO ] Logging to bulkloader-log-20151112.171238
 [INFO ] Throttling transfers:
 [INFO ] Bandwidth: 250000 bytes/second
 [INFO ] HTTP connections: 8/second
 [INFO ] Entities inserted/fetched/modified: 20/second
 [INFO ] Batch Size: 10
 2015-11-12 17:12:38,466 INFO client.py:669 access_token is expired. Now: 2015-11-12 22:12:38.466000, token_expiry: 2015-11-06 01:33:21
 Error 302:
 --- begin server output ---

This looked like an issue I had seen in the dev server's remote API handler after ClientLogin was deprecated (but in the Python SDK). Just to verify, I edited build.gradle to use the latest SDK (1.9.28, up from 1.9.18) and ran it again:

 MobileAssistant-Data> appcfg.py upload_data --config_file bulkloader.yaml --url=http://localhost:8080/remote_api --filename places.csv --kind=Place -e myEmailAddress@gmail.com
 05:16 PM Uploading data records.
 [INFO ] Logging to bulkloader-log-20151112.171615
 [INFO ] Throttling transfers:
 [INFO ] Bandwidth: 250000 bytes/second
 [INFO ] HTTP connections: 8/second
 [INFO ] Entities inserted/fetched/modified: 20/second
 [INFO ] Batch Size: 10
 2015-11-12 17:16:15,177 INFO client.py:669 access_token is expired. Now: 2015-11-12 22:16:15.177000, token_expiry: 2015-11-06 01:33:21
 2015-11-12 17:16:15,565 INFO client.py:669 access_token is expired. Now: 2015-11-12 22:16:15.565000, token_expiry: 2015-11-06 01:33:21
 2015-11-12 17:16:15,573 INFO client.py:571 Refreshing due to a 401 (attempt 1/2)
 2015-11-12 17:16:15,575 INFO client.py:797 Refreshing access_token
 2015-11-12 17:16:16,039 INFO client.py:571 Refreshing due to a 401 (attempt 2/2)
 2015-11-12 17:16:16,040 INFO client.py:797 Refreshing access_token
 ... (ad infinitum)

This output now exactly mirrors the Python remote API problem, so it appears the same issue exists with the Java remote API (authentication has not been properly updated to the new auth scheme).

The workaround on the Python side was to manually edit the SDK source and bypass the authentication check. I suspect a similar hack would be needed for the Java SDK. That is not as simple, though, since the SDK would need to be rebuilt from source.

If I run into anything else, I will update this answer with my findings. Note that this should work fine against a deployed application; it only affects the dev server.

Update:

The culprit is the admin check in com/google/apphosting/utils/remoteapi/RemoteApiServlet.java, the same problem as in the Python SDK issue mentioned above. Unfortunately, you cannot trivially rebuild the SDK from source, since the build target in build.xml only includes "jsr107cache" and the rest of the build is done from pre-built binaries. It looks like we will just have to wait for this to be fixed in a future release; in the meantime I have updated the publicly visible bug report.

For now, I recommend sticking to the documentation and using remote_api uploads only against the deployed version of the application.


I had the same problem, though with the deployed version rather than the local development server. After almost giving up, I found a workaround for uploading the data with appcfg.

What did not work for me:

 gcloud auth login
 appcfg.py upload_data --config_file bulkloader.yaml --url=http://<yourproject>.appspot.com/remote_api --filename places.csv --kind=Place --email=<you@gmail.com>

Error output:

 11:10 AM Uploading data records.
 [INFO ] Logging to bulkloader-log-20160108.111007
 [INFO ] Throttling transfers:
 [INFO ] Bandwidth: 250000 bytes/second
 [INFO ] HTTP connections: 8/second
 [INFO ] Entities inserted/fetched/modified: 20/second
 [INFO ] Batch Size: 10
 Error 302:
 --- begin server output ---
 --- end server output ---

As expected, I was not asked to authenticate again during the second command, but apparently appcfg was still unable to authenticate my account. I am on Windows 7 with Python 2.7, using appcfg.py from the Python SDK for Google App Engine and gcloud from the Google Cloud SDK, if I recall correctly.

However, https://cloud.google.com/container-registry/docs/auth shows that you can print an access token with a gcloud command and then pass it to the appcfg command manually, which worked for me.

What worked for me:

 gcloud auth login
 gcloud auth print-access-token

This prints an access token that you can pass to appcfg:

 appcfg.py upload_data --oauth2_access_token=<oauth2_access_token> --config_file bulkloader.yaml --url=http://<yourproject>.appspot.com/remote_api --filename places.csv --kind=Place --email=<you@gmail.com> 

Output of a successful data upload:

 10:42 AM Uploading data records.
 [INFO ] Logging to bulkloader-log-20160108.104246
 [INFO ] Throttling transfers:
 [INFO ] Bandwidth: 250000 bytes/second
 [INFO ] HTTP connections: 8/second
 [INFO ] Entities inserted/fetched/modified: 20/second
 [INFO ] Batch Size: 10
 [INFO ] Opening database: bulkloader-progress-20160108.104246.sql3
 [INFO ] Connecting to <yourproject>.appspot.com/remote_api
 [INFO ] Starting import; maximum 10 entities per post.
 [INFO ] 3 entities total, 0 previously transferred
 [INFO ] 3 entities (4099 bytes) transferred in 1.7 seconds
 [INFO ] All entities successfully transferred

Hope this helps anyone running into this problem. It is not clear to me what the root cause is, but this workaround works for me.

By the way, I noticed the same problem on a Mac.
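The two-step token workaround above can also be scripted. Here is a small Python sketch that composes the appcfg invocation with an explicit token; the helper name and placeholder values are mine, and in practice the token would still come from `gcloud auth print-access-token`:

```python
import shlex

def build_upload_cmd(token, project, email,
                     csv_file="places.csv", kind="Place"):
    # Compose the appcfg.py bulkloader call with an explicit OAuth2 access
    # token; the flag names match the commands shown above.
    return [
        "appcfg.py", "upload_data",
        "--oauth2_access_token=" + token,
        "--config_file", "bulkloader.yaml",
        "--url=http://%s.appspot.com/remote_api" % project,
        "--filename", csv_file,
        "--kind=" + kind,
        "--email=" + email,
    ]

# Placeholder token; obtain a real one with: gcloud auth print-access-token
cmd = build_upload_cmd("ya29.EXAMPLE", "yourproject", "you@gmail.com")
print(" ".join(shlex.quote(c) for c in cmd))
```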


It's better to use the new interface in the Google Developer Console, at: https://console.developers.google.com/project/<YOUR_PROJECT_ID>/datastore

There you can see the Query and Indexes subsections for your kinds [you can also use GQL].


NOTE: I also noticed that a particular kind is not shown in the Query section unless some data has been added to it [GQL also returns empty results]. However, I do see that kind in the Indexes section.

