Flask: passing a background worker job between requests (rq, redis)

I want to do a very simple thing: hand some work to a background worker and then return the result to the user. I am trying to do this with a combination of Flask and RQ.

    import os

    from flask import Flask, session
    from somewhere import do_something
    from rq import Queue
    from worker import conn

    app = Flask(__name__)
    app.debug = True
    app.secret_key = '....'

    q = Queue(connection=conn)

    @app.route('/make/')
    def make():
        job = q.enqueue(do_something, 'argument')
        session['job'] = job
        return 'Done'

    @app.route('/get/')
    def get():
        try:
            session['job'].refresh()
            out = str(session['job'].result)
        except:
            out = 'No result yet'
        return out

The idea in this very simple example is that people go to /make/ and the task starts. After a while, you can go to /get/ and the result from the worker is printed there.
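For context, the worker module imported above is not shown in the question. A minimal sketch of what it presumably looks like, based on the standard rq worker script (the REDIS_URL fallback and the 'default' queue name are assumptions):

    # worker.py (assumed)
    import os

    import redis
    from rq import Worker, Queue, Connection

    # Connect to Redis; fall back to a local instance if REDIS_URL is not set.
    redis_url = os.getenv('REDIS_URL', 'redis://localhost:6379')
    conn = redis.from_url(redis_url)

    if __name__ == '__main__':
        # Start a worker listening on the 'default' queue.
        with Connection(conn):
            Worker([Queue('default')]).work()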

However, one line causes problems:

 session['job'] = job 

It seems that the job cannot be pickled, and pickling is apparently what the Flask session uses. I get this error message:

    ...
    10:52:16 web.1 | File "/Users/julius/twitter-sentiment/venv/lib/python2.7/site-packages/flask/app.py", line 804, in save_session
    10:52:16 web.1 |   return self.session_interface.save_session(self, session, response)
    10:52:16 web.1 | File "/Users/julius/twitter-sentiment/venv/lib/python2.7/site-packages/flask/sessions.py", line 205, in save_session
    10:52:16 web.1 |   secure=secure, domain=domain)
    10:52:16 web.1 | File "/Users/julius/twitter-sentiment/venv/lib/python2.7/site-packages/werkzeug/contrib/securecookie.py", line 329, in save_cookie
    10:52:16 web.1 |   data = self.serialize(session_expires or expires)
    10:52:16 web.1 | File "/Users/julius/twitter-sentiment/venv/lib/python2.7/site-packages/werkzeug/contrib/securecookie.py", line 235, in serialize
    10:52:16 web.1 |   self.quote(value)
    10:52:16 web.1 | File "/Users/julius/twitter-sentiment/venv/lib/python2.7/site-packages/werkzeug/contrib/securecookie.py", line 192, in quote
    10:52:16 web.1 |   value = cls.serialization_method.dumps(value)
    10:52:16 web.1 | File "/Users/julius/twitter-sentiment/venv/bin/../lib/python2.7/copy_reg.py", line 70, in _reduce_ex
    10:52:16 web.1 |   raise TypeError, "can't pickle %s objects" % base.__name__
    10:52:16 web.1 | TypeError: can't pickle function objects

I really hope someone can help. I may be going about this completely wrong (passing the job through the session), but I have no idea how else to access the result of the job...

Any help would be greatly appreciated.

Thanks in advance.

+8
python flask backgroundworker session redis
2 answers

I have not used rq before, but I see that the job has a .key property built from its id. It might be easier to keep the job's id in your session. Then you can use the Job class's .fetch classmethod, which loads the job back from Redis (refreshing it for you) and returns it. Reading .result at that point gives you the current result of the job (it is None until the worker has finished).

Maybe this (untested):

    from rq.job import Job

    @app.route('/make/')
    def make():
        job = q.enqueue(do_something, 'argument')
        session['job'] = job.get_id()
        return 'Done'

    @app.route('/get/')
    def get():
        try:
            # Job.fetch is a classmethod: it looks the job up in Redis by id
            job = Job.fetch(session['job'], connection=conn)
            out = str(job.result)
        except:
            out = 'No result yet'
        return out
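For reference, this is how .result behaves outside of Flask: a minimal sketch, assuming the same q and do_something as in the question and a worker that is already running (the sleep duration is an arbitrary assumption).

    import time

    job = q.enqueue(do_something, 'argument')
    print(job.get_id())   # the string worth storing in the session
    print(job.result)     # None: the worker has not finished yet

    time.sleep(2)         # give the worker a moment to run the job
    print(job.result)     # the return value of do_something('argument'), once it has finished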
+3

The problem is with serializing the arguments (you are actually trying to serialize a function object, which is not possible with pickle).

Try

    @app.route('/make/')
    def make():
        job = q.enqueue(func=do_something, args=('argument',))
        session['job'] = job
        return 'Done'
+2