Performing parallel ffmpeg conversions on dozens of servers simultaneously. Is there a better way?

I am currently working on a rather large site that requires several thousand files to be encoded with ffmpeg into Flash format (along with creating thumbnails and encoding mobile versions). Each encode takes roughly 5-15 minutes, since the source files are rather large. At the end of the task the outputs have to be pushed to different servers (the Flash file to one box, the thumbnails to another, the mobile file to another, and the source file to yet another).

During this process (at the moment it is 13 steps) there is a LOT that can go wrong at any given time (bad files, stuck encodings, network latency, dying servers, peak traffic), so error handling is a bitch.
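For context, a single step in a pipeline like this usually boils down to shelling out to ffmpeg and checking the exit code, along these lines. This is only a minimal sketch: the function name, paths, and ffmpeg flags are placeholders, not the poster's actual 13-step setup.

```php
<?php
// One pipeline step: run ffmpeg and treat a non-zero exit code as a failure.
// $source, $target and the flags are placeholders, not the real setup.
function encodeToFlv(string $source, string $target): bool
{
    $cmd = sprintf(
        'ffmpeg -y -i %s -ar 44100 -b:v 700k %s 2>&1',
        escapeshellarg($source),
        escapeshellarg($target)
    );

    exec($cmd, $output, $exitCode);

    if ($exitCode !== 0) {
        // Keep the output around so stuck or broken encodes can be investigated.
        error_log("ffmpeg failed ({$exitCode}) for {$source}:\n" . implode("\n", $output));
        return false;
    }
    return true;
}
```

Checking and logging the exit code of every step is what makes retries and alerting possible later, whatever queueing system ends up driving the steps.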

Are there existing solutions for this kind of work? Or do I just need to adjust my homegrown scripts?

4 answers

Code would be the perfect candidate for this job. The system is designed to be reliable and fault tolerant. Plus, it's free.

See also the custom notifications documentation: http://www.transcodem.com/documentation/notifications.html#custom


If I were you, I would look at Gearman (see http://danga.com/gearman/): it is a framework for farming work out to other processes or machines, which fits the separate steps you describe (encoding, thumbnail generation, mobile versions, uploading the results, and so on).
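A minimal sketch of that split using the PHP gearman extension (assuming it is installed); the function name `encode_video`, the payload fields, host, and paths are invented for the example, and the producer and worker would normally live in separate scripts on different machines:

```php
<?php
// Producer (web/app server): queue one encode job and return immediately.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);                 // Gearman job server
$client->doBackground('encode_video', json_encode([
    'source' => '/srv/uploads/clip-042.mov',           // hypothetical paths
    'target' => '/srv/encoded/clip-042.flv',
]));

// Worker (each encoding box): pull jobs and do the actual ffmpeg call.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('encode_video', function (GearmanJob $job) {
    $args = json_decode($job->workload(), true);
    $cmd  = sprintf(
        'ffmpeg -y -i %s %s',
        escapeshellarg($args['source']),
        escapeshellarg($args['target'])
    );
    exec($cmd, $out, $code);
    if ($code !== 0) {
        $job->sendFail();       // report the failure back to the job server
        return;
    }
    return 'ok';
});
while ($worker->work());
```

Note that background jobs return to the caller immediately, so the worker has to record success or failure somewhere the rest of the pipeline can see it.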


After searching the Internet, I came to the conclusion that there are not many off-the-shelf solutions. I have not used it myself and it is not PHP (it uses node.js), but I found this: http://www.transcodem.com/


If I were building this again, I would look for:

  • A decent message queue such as Redis or Beanstalkd for the offline processing (see the sketch after this list).
  • A decent script to handle the uploads.
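For the queue option, here is a minimal sketch using the phpredis extension and a plain Redis list; the key name, payload fields, and paths are invented for the example, and Beanstalkd would play the same role with tubes instead of lists:

```php
<?php
// Producer: the upload handler just records what needs to be encoded.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->lPush('encode:jobs', json_encode([
    'source' => '/srv/uploads/clip-007.mov',   // hypothetical path
    'steps'  => ['flv', 'thumbnails', 'mobile'],
]));

// Worker: each encoding server loops, blocking until a job shows up.
while (true) {
    $item = $redis->brPop(['encode:jobs'], 0); // returns [key, payload]
    if (!$item) {
        continue;
    }
    $job = json_decode($item[1], true);
    // ... run ffmpeg for each step, push results or failures onto other lists ...
}
```

The point of the split is that the web request only pushes a job and returns, while the slow ffmpeg work happens on whichever encoding box pops the job next.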

I think you are right about the error handling; that part you will just have to deal with yourself.

