I have a SilverStripe website that deals with very large data sets. I created an API that returns a very large JSON dump, and I call this API with an AJAX GET request.
When the AJAX call hits the API, it takes about 10 minutes to return the data (the JSON payload is very large, and the client has accepted this).
While users are waiting for the data to return, they open the same site in a different tab to do other things, but the site is very slow until the previous AJAX request completes.
Is there anything I can do so the rest of the site stays responsive while the big JSON request is in flight?
Here is the code and an explanation of what it does:
I created a method called geteverything, which runs on the web server, as shown below. It calls another server (the data server) to fetch the data through a streaming API that sits on the data server. There is a lot of data and the data server is slow; my client does not mind the long request, but they do mind how slow everything else becomes while it runs. Sessions are used to determine the details of the request.
protected function geteverything($http, $id) {
    // Look up the system record the request refers to
    if (($System = DataObject::get_by_id('ESM_System', $id))) {
        // All four query parameters must be present
        if (isset($_GET['AAA']) && isset($_GET['BBB']) && isset($_GET['CCC']) && isset($_GET['DDD'])) {
            $request = "http://dataserver/streaming?method=xxx";
            set_time_limit(120);
            // Buffer the entire upstream response, then echo it to the client
            $jsonstring = file_get_contents($request);
            echo($jsonstring);
        }
    }
}
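For comparison, here is a minimal self-contained sketch of an alternative shape for that handler. It is an illustration, not SilverStripe-tested code, and the names (geteverything_streaming, copyStream, the dataserver URL) are hypothetical. It assumes PHP's default file-based session handler, whose per-session write lock is one common reason other requests from the same browser stall behind a long-running one: session_write_close() releases that lock before the slow fetch, and the upstream body is forwarded in chunks instead of being buffered whole.

```php
<?php
// Copy an input stream to an output stream in fixed-size chunks, so the
// whole upstream body never has to fit in memory at once.
function copyStream($in, $out, int $chunkSize = 8192): int
{
    $bytes = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $bytes += fwrite($out, $chunk);
    }
    return $bytes;
}

// Hypothetical rework of the handler in the question.
function geteverything_streaming(string $request): void
{
    // Assumption: the session details have already been read at this point.
    // Releasing the write lock lets other tabs proceed while this request
    // streams for many minutes; the session is read-only afterwards.
    session_write_close();

    set_time_limit(0);           // the fetch is expected to be slow
    header('Content-Type: application/json');

    $in = fopen($request, 'rb'); // open the data server's streaming URL
    if ($in === false) {
        http_response_code(502);
        return;
    }
    $out = fopen('php://output', 'wb');
    copyStream($in, $out);       // forward chunks as they arrive
    fclose($in);
    fclose($out);
}
```

Whether this helps depends on whether the slowdown really comes from session locking rather than, say, PHP worker exhaustion on the web server.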
How can I fix this or what else do you need to know to help?