Randomized pagination results

I currently have a list of data that is displayed in a randomly generated order. There is no pagination right now, so it's easy to generate that random order on the fly. As the data grows, though, the page gets slower, so pagination is the obvious solution. The problem with pagination is that I can't generate a fresh random order on every page load, so my manager and I came to the conclusion that the ordered list should be prepared in advance and re-created every x amount of time. Now the problem is how to save this generated list. These are the four options we came up with:

  • Session (adds load on the server).
  • Cookies (more data transmitted with every request; think of the thousands of integer values passed to the user).
  • Flat file (implementation takes a little longer than the rest, but not by much).
  • Database (a cron job runs every x amount of time and does a bulk update on all records; we fear that with a lot of data, it may slow down the system for people hitting the server during the update).

If there are any other solutions that seem better than a pre-generated, time-based list, I'd love to hear about them.

Thanks.

UPDATE: an answer that I really liked, but which was deleted for some reason, suggested using a SEED. Then I can store the seed instead of the whole list of identifiers, which shrinks my data store and simplifies everything. I just tested the solution and it works almost flawlessly. The only problem is that when I use LIMIT, everything gets messed up. Does anyone have a suggestion? I don't want to generate all the data every time; I just want to use LIMIT x,y. But when I use that together with a seed, the random sequence always resets, as it is supposed to.
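
For concreteness, a minimal sketch of the query shape being described here (the items table name, the seed value, and the page size are assumptions, not from the original post):

    -- Same seed on every request; each page only moves the window.
    -- LIMIT x,y form: skip 20 rows, return the next 20 (i.e. page 2).
    SELECT * FROM items ORDER BY RAND(42) LIMIT 20, 20;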

Hope that made sense; it was clearer in my head than it probably is written down.

6 answers

MySQL's RAND() takes a seed as an optional argument. Given the same seed, it will return the same randomized result set each time.

What you can do is generate a random seed in PHP on the first page request and pass it along to every subsequent page in the query string.
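
A minimal sketch of that approach, assuming a table named items, a mysqli connection, and a page size of 20 (all assumptions, not from the answer):

    <?php
    // Reuse the seed from the query string, or create one on the first request.
    $seed    = isset($_GET['seed']) ? (int) $_GET['seed'] : mt_rand(1, 1000000);
    $page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
    $perPage = 20;
    $offset  = ($page - 1) * $perPage;

    // Connection details are placeholders.
    $mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

    // $seed, $perPage and $offset are all cast to int, so interpolation is safe here.
    $result = $mysqli->query(
        "SELECT * FROM items ORDER BY RAND($seed) LIMIT $perPage OFFSET $offset"
    );

    // Every pagination link must carry the same seed.
    $nextUrl = '?page=' . ($page + 1) . '&seed=' . $seed;

As long as the seed in the query string stays the same, every page slices the same randomized ordering.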

Edit: sorry, I did not realize that this solution had already been posted and then deleted.

---

Use option 4, but perhaps store only the identifiers of the records to retrieve, in the order in which they should be returned.

Better yet, if possible (since this sounds like a scaling issue), pre-generate the data for each page. For example, if the list is only ever viewed through a browser, go ahead and pre-create the x number of static HTML pages (or just the table/list fragment of each page).

I know this probably sounds hand-wavy without further explanation, but think about one of these options.

In addition, if the server is taking that many hits while the results are being generated, the generation should be separated from the web server, and possibly run against a clone/replicated slave of the source database.
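
A hedged sketch of the ID-table variant of option 4 (the item_order helper table and all names here are assumptions, not from the answer): the cron job rewrites the ordering in bulk, and each page request reads only its slice.

    CREATE TABLE item_order (
        position INT AUTO_INCREMENT PRIMARY KEY,
        item_id  INT NOT NULL
    );

    -- Cron job, every x amount of time: rebuild the random ordering.
    -- TRUNCATE also resets the AUTO_INCREMENT position counter.
    TRUNCATE item_order;
    INSERT INTO item_order (item_id)
    SELECT id FROM items ORDER BY RAND();

    -- Per page request (here: page 3 at 20 items per page).
    SELECT i.*
    FROM item_order o
    JOIN items i ON i.id = o.item_id
    ORDER BY o.position
    LIMIT 20 OFFSET 40;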

---

Use a subquery. That way you can still use the RAND() trick, but OFFSET won't mess you up.

    select * from (select * from items order by rand(3) asc) as b limit 5 offset @x;

Alternative tip: upgrade your MySQL; this was a bug in old versions.


Example:

    mysql> select * from (select id from items order by rand(3) asc) as b limit 5 offset 1;
    +-------+
    | id    |
    +-------+
    | 24621 |
    | 25214 |
    | 27119 |
    | 24672 |
    | 25585 |
    +-------+
    5 rows in set (0.01 sec)

    mysql> select * from (select id from items order by rand(3) asc) as b limit 5 offset 2;
    +-------+
    | id    |
    +-------+
    | 25214 |
    | 27119 |
    | 24672 |
    | 25585 |
    | 27718 |
    +-------+
    5 rows in set (0.01 sec)

    mysql> select * from (select id from items order by rand(3) asc) as b limit 5 offset 3;
    +-------+
    | id    |
    +-------+
    | 27119 |
    | 24672 |
    | 25585 |
    | 27718 |
    | 25457 |
    +-------+
    5 rows in set (0.00 sec)

    mysql> select * from (select id from items order by rand(3) asc) as b limit 5 offset 4;
    +-------+
    | id    |
    +-------+
    | 24672 |
    | 25585 |
    | 27718 |
    | 25457 |
    | 27112 |
    +-------+
    5 rows in set (0.01 sec)

    mysql> select * from (select id from items order by rand(3) asc) as b limit 5 offset 5;
    +-------+
    | id    |
    +-------+
    | 25585 |
    | 27718 |
    | 25457 |
    | 27112 |
    | 24779 |
    +-------+
    5 rows in set (0.02 sec)
---

I prefer a flat file. Look at how this caching class from OpenCart is written:

    <?php
    final class Cache {
        private $expire = 3600; // Lifetime of a cache entry, in seconds.

        // On construction, purge any cache files whose timestamp suffix has expired.
        public function __construct() {
            $files = glob(DIR_CACHE . 'cache.*');

            if ($files) {
                foreach ($files as $file) {
                    $time = substr(strrchr($file, '.'), 1);

                    if ($time < time()) {
                        unlink($file);
                    }
                }
            }
        }

        public function get($key) {
            $files = glob(DIR_CACHE . 'cache.' . $key . '.*');

            if ($files) {
                foreach ($files as $file) {
                    $handle = fopen($file, 'r');
                    $cache = fread($handle, filesize($file));
                    fclose($handle);

                    return unserialize($cache);
                }
            }
        }

        // The expiry time is encoded into the file name.
        public function set($key, $value) {
            $this->delete($key);

            $file = DIR_CACHE . 'cache.' . $key . '.' . (time() + $this->expire);

            $handle = fopen($file, 'w');
            fwrite($handle, serialize($value));
            fclose($handle);
        }

        public function delete($key) {
            $files = glob(DIR_CACHE . 'cache.' . $key . '.*');

            if ($files) {
                foreach ($files as $file) {
                    unlink($file);
                }
            }
        }
    }
    ?>

It's really easy to use and it works well: you run your query once and save the data to a file. Here is an example:

    $cache = new Cache();
    $data = $cache->get('my_query_key');

    if (!$data) {
        // I do my query and I put it into an array (because I can use shuffle :P)
        $result = mysql_query('SELECT * FROM items');

        $data = array();
        while ($row = mysql_fetch_assoc($result)) {
            $data[] = $row;
        }

        $cache->set('my_query_key', $data);
    }

    shuffle($data);

The only thing to watch out for is the cached file growing beyond 100 kilobytes or so, but I use this approach myself and it works fine, so I have had no problems. Ah, and in this case you do not need RAND() in the query. :P

I am writing this without checking the syntax, beware ^^

---

Using a seed is the best solution, as stated above...

Using the following logic, you can shuffle the data individually for each visitor (seeding with their IP address) while keeping the pagination results stable for a good amount of navigation time...

    // Generate an individual seed...
    $ip    = $_SERVER['REMOTE_ADDR'];
    $hour  = date("H");
    $day   = date("j");
    $month = date("n");
    $ip    = str_replace(".", "", $ip);
    $seed  = ($ip + $hour + $day + $month);

    // Query
    $query = "SELECT * FROM my_table ORDER BY RAND($seed) LIMIT $amount OFFSET $offset";

Pros: fast and simple, no extra workload, no cron required, and unique randomness for each visitor.

Cons: the ordering resets if the hour, day, or month changes while the visitor is navigating.

---

Here is my working solution, based on Kevin's answer:

    public function galleries() {
        if (!Input::has('page')) {
            // No 'page' parameter means we are on the first page.
            $seed = rand(1, 10); // Generate a random number between 1 and 10...
            Session::put('seed', $seed); // ...and store it in the session as 'seed'.
        } else {
            if (Input::get('page') == 1) {
                // page = 1 also indicates the first page.
                $seed = rand(1, 10);
                Session::put('seed', $seed);
            }
        }

        // Pass the seed value to RAND(); this way the second page is generated
        // with the same seed as the first page.
        $orgs = Organization::orderByRaw("RAND(" . Session::get('seed') . ")")->paginate(4);

        return redirect('galleries')->with('orgs', $orgs);
    }

Cheers.

Edit: this is a Laravel-based project, but you get the idea.

