Magento FPC cache warming with customer groups, wget and Lesti FPC

I use Lesti FPC on a Magento website with 10 customer groups and many categories/products.

I created a shell script that reads sitemap.xml and wgets each URL overnight to build the site cache. This works well for guests, but when a user from a customer group logs in, they end up building the cache themselves (if they are the first visitor of the day).

Does anyone know how to write a shell script that could simulate a login and then crawl the site? Is it even possible for a shell script to store its own session/cookie information in order to stay logged in? And if not, any other ideas?

Many thanks

1 answer

So, thanks to some Googling and a lot of trial and error, I found a solution that I thought I would share.

You can use wget to maintain session/cookie information by saving and loading cookies. Magento has one quirk: a session cookie must already be set before you attempt to log in, otherwise the script gets redirected to the "enable-cookies" page instead of being logged in. So here is the script:

    #!/bin/bash

    # Establish a session and nab the cookie
    wget --save-cookies cookies.txt \
         http://www.yourmagentourl.co.uk/

    # Post your user credentials to login and update the cookie
    wget --save-cookies cookies.txt \
         --load-cookies cookies.txt \
         --post-data 'login[username]=USERNAME&login[password]=PASSWORD' \
         http://www.yourmagentourl.co.uk/customer/account/loginPost/

    # Load the cookie for each page you want to WGET to maintain the session
    wget --load-cookies cookies.txt \
         -p http://www.yourmagentourl.co.uk/some-category.html
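One caveat worth knowing: wget only writes session cookies (cookies issued without an expiry time) to cookies.txt if you also pass --keep-session-cookies. Magento's frontend cookie normally has a lifetime set, so the script above should work as-is, but if your store issues session-only cookies, add that flag to both --save-cookies calls.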

This is the foundation; from there it's very easy to loop through all the URLs in the sitemap.xml file and build logged-in versions of the cache.
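For example, here is a minimal sketch that combines the login above with a sitemap crawl. The cookies.txt file, placeholder credentials and base URL are the same assumptions as above, and the egrep extraction is borrowed from the guest script further down:

    #!/bin/bash
    # Sketch: warm the logged-in cache for one customer group.
    # USERNAME/PASSWORD and the base URL are placeholders - adjust for your store.
    BASE="http://www.yourmagentourl.co.uk"

    # Establish a session, then log in (as in the script above).
    # --keep-session-cookies is a safety net in case the frontend
    # cookie is issued without an expiry time.
    wget -q --keep-session-cookies --save-cookies cookies.txt \
         --delete-after "$BASE/"
    wget -q --keep-session-cookies --save-cookies cookies.txt \
         --load-cookies cookies.txt --delete-after \
         --post-data 'login[username]=USERNAME&login[password]=PASSWORD' \
         "$BASE/customer/account/loginPost/"

    # Pull every URL out of the sitemap and fetch each one with the
    # session cookie, so the FPC stores the logged-in variant.
    wget -q "$BASE/sitemap.xml" --output-document - \
      | egrep -o "$BASE/[^<]+" \
      | wget -q --load-cookies cookies.txt --delete-after -i -

Run it once per customer group with a different USERNAME and you warm each group's cache variant.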

Credit to Grafista for the cookie storage approach.

Happy caching!

EDIT - AS ASKED, HERE IS THE ORIGINAL CODE

Here is the code that loops through the sitemap and loads each page to build the cache for guests. Save it as cachewarm.sh and create a cronjob to run it every night (remember to flush or expire your page cache first).

    #!/bin/bash
    # Pixie Media https://www.pixiemedia.co.uk
    # Use the sitemap and reload the Page Cache by accessing each page once
    #
    wget --quiet http://YOUR-URL.co.uk/sitemap.xml --output-document - \
      | egrep -o "http://YOUR-URL.co.uk/[^<]+" \
      | wget -q --delete-after -i -
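For reference, a nightly crontab entry might look like this (the script path and the 2am run time are just example assumptions):

    # Warm the page cache at 2am every night
    0 2 * * * /bin/bash /path/to/cachewarm.sh >/dev/null 2>&1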