How can I optimize my FQL to avoid Facebook timeouts?

Take a simple FQL query that fetches all links shared by a user's friends since yesterday, for example:

 SELECT link_id, title, url, owner, created_time FROM link WHERE created_time > strtotime('yesterday') AND owner IN ( SELECT uid2 FROM friend WHERE uid1 = me() ) LIMIT 100 

If the user has 50 friends, this works just fine. But if the user has hundreds of friends, Facebook usually returns an error.

Options:

  • Limit the friend subquery to 50 . Of course this works, but it shows the same friends every time; unless you rotate through the list somehow, this is not very useful.
  • Batch requests . Build a batch of queries using offsets, each limited to 50 friends. Unfortunately, no improvement.
  • Loop it . This is the best I have found so far. Loop through the same queries you built for the batch request, but run them one at a time with multiple fql.query API calls. Even this is hit and miss, though.
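The "loop it" option above can be sketched as plain PHP that builds one query per offset window; the query text mirrors the example at the top, and each string would then be sent through a separate fql.query API call (a sketch, not the SDK's own API):

```php
<?php
// Build one FQL query per window of 50 friends, paging through the
// friend subquery with LIMIT/OFFSET. Each query is then run on its own
// instead of one huge IN (...) subquery. $since is a Unix timestamp.
function build_offset_queries($since, $totalFriends, $chunkSize = 50) {
    $queries = array();
    for ($offset = 0; $offset < $totalFriends; $offset += $chunkSize) {
        $queries[] = "SELECT link_id, title, url, owner, created_time FROM link "
                   . "WHERE created_time > $since AND owner IN ("
                   . "SELECT uid2 FROM friend WHERE uid1 = me() "
                   . "LIMIT $chunkSize OFFSET $offset) LIMIT 100";
    }
    return $queries;
}
```

Each resulting string would be passed as the `query` parameter of an fql.query call in a loop.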

How can I query Facebook reliably to ensure successful results?

Notes:

  • I am using the latest PHP SDK, version 3.1.1
  • I also tried increasing the default curl timeout options in the base_facebook.php file
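For reference, the curl timeout defaults mentioned above live in the static Facebook::$CURL_OPTS array in base_facebook.php and can be overridden before making any calls; the values below are illustrative only, and raising them merely delays the timeout rather than fixing the slow query:

```php
<?php
require_once 'facebook/php-sdk/src/facebook.php';

// Override the SDK's default curl timeouts (illustrative values).
Facebook::$CURL_OPTS[CURLOPT_CONNECTTIMEOUT] = 30; // default is 10
Facebook::$CURL_OPTS[CURLOPT_TIMEOUT] = 120;       // default is 60
```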

Common timeout errors:

1.

 Fatal error: Uncaught Exception: 1: An unknown error occurred thrown in /..../facebook/php-sdk/src/base_facebook.php on line 708 

Line 708 is where the exception is thrown:

 // results are returned, errors are thrown
 if (is_array($result) && isset($result['error_code'])) {
     throw new FacebookApiException($result);
 }

2.

 Fatal error: Uncaught CurlException: 52: SSL read: error:00000000:lib(0):func(0):reason(0), errno 104 thrown in /..../facebook/php-sdk/src/base_facebook.php on line 814 
+8
facebook facebook-fql
3 answers

You should loop using limit/offset, as you said, or cache the friend list up front, as puffpio suggested.

You said that it still does not work reliably; that is because some users may have many, many links while others may not. Also note that you may not be able to retrieve data for some users at all. I would recommend a single retry in your loop for failed requests: it often happens that the first attempt times out and the second succeeds because the data has just been cached.
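The single-retry idea can be sketched with a small generic wrapper; `call_with_retry` and the closure below are hypothetical names, standing in for a real SDK call such as an fql.query request through $facebook->api():

```php
<?php
// Run $fn once and, on failure, retry exactly once. The second attempt
// often succeeds because Facebook has just cached the intermediate data.
// If the retry also fails, its exception propagates to the caller.
function call_with_retry($fn) {
    try {
        return $fn();
    } catch (Exception $e) {
        return $fn(); // single retry
    }
}
```

Inside your loop over the chunked queries, each API call would be wrapped in `call_with_retry`, so a one-off timeout does not abort the whole run.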

Finally, for posterity, I am opening a task to optimize the link table so it performs better when filtered by time.

+2

Some database engines do not optimize the IN keyword well, or at all; they may execute the IN subquery once for each result row of the outer query. Can you join the link and friend tables instead of using IN with a subquery?

You may find this article useful. (It discusses performance issues with IN clauses in MySQL, and Facebook runs MySQL on the back end.)

+1

It is better to cache the user's friend list and refresh it periodically. In other words, run this query:

 SELECT uid2 FROM friend WHERE uid1 = me() 

Load the resulting user list and then run:

 SELECT link_id, title, url, owner, created_time FROM link WHERE created_time > strtotime('yesterday') AND owner IN (/*your user list here*/) LIMIT 100 

This way you are not running the inner query every time. In practice a user's friend list does not have a high churn rate, so you will not need to refresh it nearly as often as you fetch the shared links.

In addition, architecting it this way lets you split the second query into several queries with different sets of owners, and then use fql.multiquery to fetch them all at once.
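A sketch of that split, assuming the friend list has already been cached in $friendIds (a hypothetical variable); fql.multiquery takes a JSON-encoded map of named queries:

```php
<?php
// Split the cached friend list into batches of 50 owners and build the
// named-query map that fql.multiquery expects. $since is a Unix timestamp.
function build_multiquery(array $friendIds, $since, $chunkSize = 50) {
    $queries = array();
    foreach (array_chunk($friendIds, $chunkSize) as $i => $chunk) {
        $queries["links$i"] =
            "SELECT link_id, title, url, owner, created_time FROM link " .
            "WHERE created_time > $since " .
            "AND owner IN (" . implode(',', $chunk) . ") LIMIT 100";
    }
    return $queries;
}

// The map would then be sent in one request, roughly:
// $facebook->api(array('method'  => 'fql.multiquery',
//                      'queries' => json_encode($queries)));
```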

+1
