How to execute asynchronous code in Twisted Klein?

I have two functions in my python twisted Klein web service:

@inlineCallbacks
def logging(data):
    ofile = open("file", "w")
    ofile.write(data)
    yield os.system("command to upload the written file")

@APP.route('/dostuff')
@inlineCallbacks
def dostuff():
    yield logging(data)
    print "check!"
    returnValue("42")

When os.system("command to upload the written file") runs, it prints "start downloading" and then "download full". I want the logging function to be asynchronous, so that the work in the logging handler happens after the dostuff handler prints "check!". (I actually want it to happen after returnValue("42"), but both of those statements live in the same asynchronous function, so I think either counts?)

I thought the yield statement would make the call non-blocking, but apparently it does not: "check!" is always printed after "start downloading" and "download full". I would appreciate any feedback on this, as I am new to asynchronous coding and have been stuck on this for a while ...

2 answers

To make your code asynchronous, you must use Twisted Deferreds as described here. Deferreds give you an API for executing asynchronous code: they let you attach callbacks to your functions, and that code runs in the Twisted event loop driven by the reactor object.
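For example, here is a minimal, standalone sketch (not from the original post; the names are illustrative) of creating a Deferred, attaching a callback, and firing it from the reactor:

from twisted.internet import defer, reactor

def on_result(value):
    print("callback got: {}".format(value))

d = defer.Deferred()
d.addCallback(on_result)

# Some time later the reactor (or your own code) fires the Deferred,
# and every attached callback runs in order with the result.
reactor.callLater(1, d.callback, "some value")
reactor.callLater(2, reactor.stop)
reactor.run()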

I see two possible ways to use Deferred in your case.

1) Running a task in the background using reactor.callLater()

This is fine if the dostuff handler does not care about the result of the logging call. You can use reactor.callLater(). That way your async function will execute after dostuff has returned its value.

So something like this:

from klein import run, route, Klein
from twisted.internet import defer, task, reactor
import os

app = Klein()

def logging(data):
    ofile = open("file", "w")
    ofile.write(data)
    result = os.system("ls")
    print(result)

@route('/')
def dostuff(request):
    reactor.callLater(0, logging, "some data")
    print("check!")
    return b'Hello, world!'

run("localhost", 8080)

The order of events with this code is as follows: first "check!" is printed, then the "Hello, world!" response is returned, and finally the deferred call finishes and prints the result of running os.system().

2016-08-11 08:52:33+0200 [-] check!
2016-08-11 08:52:33+0200 [-] "127.0.0.1" - - [11/Aug/2016:06:52:32 +0000] "GET / HTTP/1.1" 200 13 "-" "curl/7.35.0"
a.py  file

2) Run the task in the background and get the result using task.deferLater()

If you care about the result of your logging function, you can also attach a callback to a Deferred and use the twisted.internet.task API. If you want to go this way, you need to reorganize the handler as follows:

@route('/')
def dostuff(request):
    def the_end(result):
        print("executed at the end with result: {}".format(result))

    dfd = task.deferLater(reactor, 0, logging, "some data")
    dfd.addCallback(the_end)
    print("check!")
    return b'Hello, world!'

Thus, the order of events will be the same as above, but the the_end function will be executed last, after the logging function completes.

2016-08-11 08:59:24+0200 [-] check!
2016-08-11 08:59:24+0200 [-] "127.0.0.1" - - [11/Aug/2016:06:59:23 +0000] "GET / HTTP/1.1" 200 13 "-" "curl/7.35.0"
a.py  file
2016-08-11 08:59:24+0200 [-] executed at the end with result: some result

The yield statement does not make anything asynchronous by itself. It turns the function that contains it into a generator function: calling that function merely returns a generator object, which can later be iterated to run the body and produce a sequence of values.
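A quick illustration of that point, in plain Python and independent of Twisted:

def gen():
    print("running")
    yield 1
    yield 2

g = gen()        # nothing is printed yet; g is just a generator object
print(list(g))   # only now does the body run: prints "running", then [1, 2]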

So dostuff() will return a generator object. Nothing happens until that generator object is iterated later, and there is nothing in your code to make that happen. I would also expect your dostuff routine to raise a syntax error, because it contains both a yield and a non-empty return. The logging routine will do nothing, because it contains a yield and the generator it returns is never used.

Finally, the logging routine will truncate its output file each time it is called, because it opens the log file in 'w' mode on every call.
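For example:

with open("file", "w") as f:   # 'w' truncates: any previous contents are lost
    f.write("first entry\n")

with open("file", "w") as f:   # opening with 'w' again wipes "first entry"
    f.write("second entry\n")

with open("file", "a") as f:   # 'a' appends instead of truncating
    f.write("third entry\n")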

For truly asynchronous execution you would need some form of multiprocessing or threading, but I do not think that is necessary in this context. Your logging function is fairly lightweight; it should run quickly and not get in the way of dostuff.
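As an aside that goes beyond this answer: if the upload command really did need to run without blocking the reactor, Twisted can push a blocking call onto a thread pool with twisted.internet.threads.deferToThread. A minimal sketch, with illustrative names:

import os
from twisted.internet import threads

def upload():
    # blocking call runs in a worker thread instead of the reactor thread
    return os.system("command to upload the written file")

def report(status):
    print("upload finished with exit status {}".format(status))

d = threads.deferToThread(upload)
d.addCallback(report)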

I suggest trying something like this:

def logging(data):
    try:
        logging._ofile.write(data + '\n')
    except AttributeError:
        # first call: the file has not been opened yet
        logging._ofile = open("file", 'w')
        logging._ofile.write(data + '\n')

@APP.route('/dostuff')
def dostuff():
    logging("before!")
    os.system("command to upload the written file")
    logging("after!")
    return "42"

Here we open the log file only once: on the first call to logging, _ofile is not yet defined as an attribute of logging, so the AttributeError branch opens the file. On subsequent calls logging._ofile is already open, and the write in the try block succeeds.

The dostuff() routine calls logging to record that the work is about to be done, actually does the work, calls logging again to record that the work is done, and finally returns the required value.

