Decoding gzipped response bodies with url-retrieve

For an Emacs extension I'm writing, I'd like to receive data via HTTP. I'm not particularly keen on shelling out to things like wget, curl or w3m to do this, so I use the url-retrieve function.

One of the HTTP servers I'm talking to ignores the Accept-Encoding header and insists on always sending its data with Content-Encoding: gzip.

As a result of this, and of the fact that url-retrieve does not automatically decode response bodies, the buffer that url-retrieve presents me with contains binary gzip data.
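
To illustrate, here is roughly what I'm doing; the URL is just a placeholder, not the actual server:

 (require 'url)

 ;; Minimal illustration of the problem (placeholder URL).
 (url-retrieve "http://example.com/some-data"
               (lambda (status)
                 ;; The buffer contains the status line, the response
                 ;; headers, a blank line, and then the raw body -- in my
                 ;; case undecoded gzip bytes (starting with \x1f \x8b).
                 (pop-to-buffer (current-buffer))))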

I'm looking for a way to decode the response body, preferably chunk by chunk as the data arrives. Is there a way to tell url-retrieve to do this for me?

Decoding the response once it has fully arrived would also be acceptable, but I'd rather avoid all the hassle involved in creating an asynchronous subprocess running gzip, feeding it the parts of the response I have received so far, and reading the decoded pieces back in. I'm looking for a library function here.

+6
emacs elisp
1 answer

What auto-compression-mode does is run gzip on the file to be uncompressed; see for example jka-compr-insert-file-contents in jka-compr.el. So if you want to use auto-compression-mode to do the decompression, you first have to write the response to a file. For example, something like this:

 (defun uncompress-callback (status)
   (let ((filename (make-temp-file "download" nil ".gz")))
     (search-forward "\n\n")             ; Skip response headers.
     (write-region (point) (point-max) filename)
     (with-auto-compression-mode
       (find-file filename))))

 (url-retrieve "http://packages.ubuntu.com/hardy/allpackages?format=txt.gz"
               #'uncompress-callback)
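
If you would rather end up with the decompressed body as a string instead of visiting the file, a variation along these lines should also work (the helper name and the temp-file cleanup are illustrative additions, not part of the snippet above):

 (defun uncompress-response-to-string (status)
   "Like `uncompress-callback', but return the decompressed body as a string."
   (let ((filename (make-temp-file "download" nil ".gz")))
     (search-forward "\n\n")             ; Skip response headers.
     (write-region (point) (point-max) filename)
     (unwind-protect
         (with-temp-buffer
           ;; insert-file-contents goes through jka-compr here, so the
           ;; .gz file is decompressed transparently.
           (with-auto-compression-mode
             (insert-file-contents filename))
           (buffer-string))
       (delete-file filename))))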

(If you don't want to create a temporary file, you'll have to do your own subprocess management, but that is not as difficult as you imply in your question.)
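
For what it's worth, the temp-file-free route can be sketched by synchronously filtering the body through an external gzip with call-process-region; this assumes a gzip executable on your PATH, and the coding-system choices are guesses for a typical text payload:

 (defun uncompress-callback-in-place (status)
   ;; Sketch: replace the compressed body with gzip's output, no temp file.
   (search-forward "\n\n")                  ; Skip response headers.
   (let ((coding-system-for-write 'binary)  ; hand the raw gzip bytes to gzip
         (coding-system-for-read 'utf-8))   ; assume the payload is UTF-8 text
     ;; Pipe the region through "gzip -d", replacing it with the output.
     (call-process-region (point) (point-max) "gzip" t t nil "-d"))
   (pop-to-buffer (current-buffer)))

 (url-retrieve "http://example.com/some-data"  ; placeholder URL
               #'uncompress-callback-in-place)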

+4
