How to shuffle jobs in a Resque queue?

I have a queue called check_integrity with a lot of jobs in it. When I start a worker for that queue, it works through the jobs in the order they were enqueued. Is it possible to shuffle the jobs in this particular queue, so the worker picks them up in random order? Please help.

Thanks.

+5
4 answers

Take a look at this plugin for Resque. I think this is exactly what you need.

+1

If you don't mind monkey patching, you can use this solution:

module Resque

  # Monkey patch Resque to store queues as Redis sets instead of lists. This
  # allows us to pop jobs randomly rather than sequentially.
  def push(queue, item)
    watch_queue(queue)
    redis.sadd "queue:#{queue}", encode(item)
  end

  def pop(queue)
    decode redis.spop("queue:#{queue}")
  end

  def size(queue)
    redis.scard("queue:#{queue}").to_i
  end
end

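A quick way to check the patched behaviour (a sketch only; CheckIntegrity is a hypothetical job class for the asker's check_integrity queue): enqueue a few jobs in order and pop some back, and the order should come out random. Note that Redis will raise a WRONGTYPE error if the queue key already exists as a plain list, so drain or delete the old queue before applying the patch.

require 'resque'

class CheckIntegrity
  @queue = :check_integrity

  def self.perform(id)
    puts "checking #{id}"
  end
end

# Enqueue a few jobs in order...
1.upto(10) { |id| Resque.enqueue(CheckIntegrity, id) }

# ...and pop some back: with the set-based patch above the order is random.
3.times { p Resque.pop(:check_integrity) }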

0

One way to do this is to pull the entries out of the queue in batches, shuffle each batch, and then re-insert it:

key = "resque:queue:bulk"
total = Redis.current.llen(key)
batch_size = 5_000 # any value that is good enough for you

batch = []
total.times do
  entry = Redis.current.lpop(key)
  batch << entry
  if batch.size == batch_size
    puts "re-inserting batch..."
    Redis.current.rpush key, batch.shuffle
    batch = []
  end
end

# Re-insert the leftover entries when total is not a multiple of batch_size,
# otherwise they would be dropped from the queue.
Redis.current.rpush key, batch.shuffle unless batch.empty?

This is really useful when you mistakenly queue up several jobs that end up racing for shared resources, locks, etc.
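For the queue in the question, the same idea could be wrapped in a small helper (a sketch; shuffle_queue is a made-up name, and it assumes no worker is consuming the queue while it runs):

require 'redis'

# Hypothetical helper: shuffles a Resque queue in place, batch by batch.
def shuffle_queue(queue_name, batch_size = 5_000)
  key = "resque:queue:#{queue_name}"
  redis = Redis.current
  batch = []

  redis.llen(key).times do
    batch << redis.lpop(key)
    next unless batch.size == batch_size
    redis.rpush key, batch.shuffle
    batch = []
  end

  # Push back whatever did not fill a complete batch.
  redis.rpush key, batch.shuffle unless batch.empty?
end

shuffle_queue("check_integrity")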

0

You could also use delayed_job.
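If switching libraries is an option, note that delayed_job workers pick jobs ordered by priority (then run_at), so you can approximate random processing by giving each job a random priority when you enqueue it. A rough sketch, assuming delayed_job is already set up in the app; CheckIntegrityJob and record_ids are made-up names for illustration:

# Struct-based job object; delayed_job only needs it to respond to #perform.
CheckIntegrityJob = Struct.new(:record_id) do
  def perform
    # ... run the integrity check for record_id ...
  end
end

# A random priority per job makes the processing order effectively random.
record_ids.each do |id|
  Delayed::Job.enqueue(CheckIntegrityJob.new(id), priority: rand(1_000_000))
end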

-2
