Bash curl: keep the connection open between requests when running them sequentially

I am trying to write a bash command that uses curl to send a GET request to two different web pages over the same connection. Think of it as sending a GET request to a login page to authenticate against the server, followed by a second request that simulates the automatic redirect to the home page that would occur in a web browser (via the meta refresh tag). I need to link the requests because the contents of the home page (generated by the server) are different for a guest user than for an authenticated user.

I first tried this command, based on the recommendation in an SO post (assume the $IP and $PORT variables are already defined with valid values):

 curl -u user:pass ${IP}:${PORT}/login.php && curl ${IP}:${PORT}/index.php 

However, between the end of the first GET and the beginning of the second, I always get something like this:

 * Connection #0 to host 10.0.3.153 left intact
 * Closing connection #0

So was the SO post wrong? In any case, this command does successfully keep the connection open between the two requests:

 curl -u user:pass ${IP}:${PORT}/login.php ${IP}:${PORT}/index.php 
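You can watch curl report the reuse by adding -v (the exact wording varies between curl versions, but for me the second transfer logs a "Re-using existing connection" line):

 curl -v -u user:pass ${IP}:${PORT}/login.php ${IP}:${PORT}/index.php 2>&1 | grep -i connection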

However, I would prefer a solution closer to the first command than to the second one. The main reason is that I want to separate the output of the first page from the output of the second page into two different output files. So I want to do something like:

 curl page1.html > output1 && curl page2.html > output2 

Of course, I need to reuse the same connection, because the contents of page2.html depend on the request to page1.php having been made within the same HTTP session.

I am also open to solutions that use netcat or wget, BUT NOT PHP!
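For example, I imagine a wget answer would look roughly like this (an untested sketch on my part — note that wget would reuse a cookie file rather than the TCP connection itself, which may or may not be enough):

 wget --user=user --password=pass --save-cookies cookies.txt --keep-session-cookies -O output1 http://${IP}:${PORT}/login.php
 wget --load-cookies cookies.txt -O output2 http://${IP}:${PORT}/index.php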

3 answers

Running curl a.html && curl b.html will necessarily use two TCP (HTTP) connections to retrieve the data. Each curl invocation is its own process and will open its own connection.

However, websites do not use the TCP/HTTP connection to track login information. Instead, a token is placed in the session (usually via a cookie) and is transmitted with subsequent requests to the site. The site checks this token on each subsequent request.
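You can watch the server hand out such a token by inspecting the response headers, for example (PHPSESSID below is just PHP's default session cookie name — yours may differ):

 curl -sv -u user:pass ${IP}:${PORT}/login.php -o /dev/null 2>&1 | grep -i set-cookie

On a typical PHP site this prints something like Set-Cookie: PHPSESSID=...; that value is what the second request must carry.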

curl has a -c option that says where to save cookies after a transfer, and a companion -b option that says which cookie file to send with a request, so

 curl -c cookiejar -u user:pass login.php && curl -b cookiejar index.php 

will be closer. I say closer because many sites do not use the HTTP-based authentication supported by the -u option, but instead use custom login forms; and second, this assumes the site tracks the session with cookies (as opposed to embedding something in JavaScript or in the URL). The latter is likely, but I would not count on the first part.
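If the site does use a form-based login, the same cookie-jar idea works with a POST instead of -u. A minimal sketch, assuming the form posts user and pass fields to login.php (the field names here are guesses — check the login form's HTML for the real ones):

 curl -c cookiejar -d 'user=myuser' -d 'pass=mypass' ${IP}:${PORT}/login.php > output1 && curl -b cookiejar ${IP}:${PORT}/index.php > output2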


According to the curl manual, the synopsis is as follows:

curl [options] [URL...]

This means that you can specify multiple URLs one after another in a single command, and curl will reuse the handle for each subsequent URL:

curl will attempt to reuse connections for multiple file transfers, so that getting many files from the same server will not do multiple connects / handshakes. This improves speed. Of course, this is only done on files specified on a single command line and cannot be used between separate curl invocations.
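Since -o can also be given once per URL, this single-invocation form even covers the separate-output-files requirement from the question — one process, one reused connection, two files (a sketch using the question's variables):

 curl -u user:pass -o output1 ${IP}:${PORT}/login.php -o output2 ${IP}:${PORT}/index.php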


Basically, this is why I made Xidel: you can write all requests and actions in a single command invocation, and it behaves like a browser that keeps cookies and the connection alive:

 xidel http://${IP}/login.php --download page1.html -f '"index.php"' --download page2.html 

Or, if there is a link from the first page to the second, it can directly follow this link:

 xidel http://${IP}/login.php --download page1.html -f //a --download page2.html 

However, it does not support HTTP authentication, or ports other than 80, 8080 and 443 (the backend would support them, but there is a URL validity check in between that rejects such URLs as invalid).

