There is no built-in mechanism for persisting cookies between Scrapy runs, but you can build one yourself (the code below demonstrates the idea and is untested):
Step 1: Write Cookies to a File.
Get the cookies from the Set-Cookie response header in your parse callback, then write them to a file.
Several ways to do this are described here: Accessing the session file in scrapy spiders
I prefer a direct approach:
cookies = ";".join(response.headers.getlist('Set-Cookie'))
cookies = cookies.split(";")
cookies = { cookie.split("=")[0]: cookie.split("=")[1] for cookie in cookies }
Ideally, this should be done with the last response your scraper receives: serialize the cookies that arrive with each response into the same file, overwriting the cookies you saved while processing the previous responses.
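For example, here is a minimal sketch of a parse callback that keeps overwriting a cookies file on every response (the cookies.json name and the COOKIES_FILE constant are my own choices, not anything Scrapy prescribes):

import json

COOKIES_FILE = "cookies.json"  # hypothetical path, pick whatever suits your project

def parse(self, response):
    # rebuild the cookie dict exactly as shown above
    raw = ";".join(h.decode("utf-8") for h in response.headers.getlist("Set-Cookie"))
    cookies = {c.split("=", 1)[0].strip(): c.split("=", 1)[1].strip()
               for c in raw.split(";") if "=" in c}
    if cookies:
        # overwrite the file so it always holds the cookies of the latest response
        with open(COOKIES_FILE, "w") as f:
            json.dump(cookies, f)
    # ... your normal item extraction continues here ...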
Step 2: Read and Use Cookies from a File.
, , "cookie":
def start_requests(self):
    # load the cookies saved to cookies.json in step 1
    with open("cookies.json") as f:
        old_cookies = json.load(f)
    for url in self.start_urls:
        yield Request(url, cookies=old_cookies, callback=self.parse)
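On the very first run the file does not exist yet, so it is worth guarding the read. A small self-contained sketch, assuming the same cookies.json file as above (the load_saved_cookies helper name is my own, not part of Scrapy):

import json
import os

def load_saved_cookies(path="cookies.json"):
    # return an empty dict when no previous run has saved cookies yet
    if not os.path.exists(path):
        return {}
    with open(path) as f:
        return json.load(f)

With that in place, start_requests can simply call old_cookies = load_saved_cookies(). Cookies passed to a Request this way are added to Scrapy's cookiejar, so they are sent with the subsequent requests of the run as well.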