Update a module on remote ipengines when using IPython

Hoping this has a direct answer that I missed in the docs. Below is the problem:

  • I have a module loaded on all ipengines at startup.
  • Since then I have made changes to the module.
  • I want these changes to apply on the remote ipengines, i.e. I want the module to be reloaded in all remote instances.

How can I do that?
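For reference, a minimal sketch of how the module might have been loaded in the first place (the module name my_module is a placeholder):

    from IPython.parallel import Client

    rc = Client()
    dview = rc[:]                                  # view on all engines
    dview.execute('import my_module', block=True)  # import once per engine

    # my_module.py is then edited locally; every engine still holds the
    # version it imported at startup.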

3 answers

This is the answer I found; not sure if it is the best way:

    from IPython.parallel import Client

    rc = Client(profile='ssh')
    dview = rc[:]
    dview.execute('reload(<module>)', block=True)
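A caveat: on Python 3 engines the reload() builtin is gone; assuming the module was imported as my_module on each engine, the equivalent call goes through importlib:

    # Python 3: reload() lives in importlib (my_module is a placeholder)
    dview.execute('import importlib; importlib.reload(my_module)', block=True)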

You can also enable autoreload at IPython startup on the machines, as follows:

    %px %load_ext autoreload
    %px %autoreload 2

Note that this solution, like the reload call with dview.execute(), has a problem when new engines can come up later (e.g. when using a batch scheduler on a cluster): both only run on the engines that are currently available.
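One way around that, sketched here under the assumption that a reload_code string like 'reload(my_module)' does the right thing on your engines, is to remember which engine ids have already been initialised and periodically target only the new ones:

    import time

    initialised = set()

    def reload_on_new_engines(rc, reload_code='reload(my_module)'):
        # rc.ids lists the engines currently registered with the hub
        new_ids = [eid for eid in rc.ids if eid not in initialised]
        if new_ids:
            # run the import/reload only on engines we have not seen yet
            rc[new_ids].execute('import my_module; ' + reload_code, block=True)
            initialised.update(new_ids)

    while True:
        reload_on_new_engines(rc)
        time.sleep(30)  # re-check for late-joining engines every 30 seconds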

Another wrinkle: you may need a deep (recursive) reload. See this ipengine option:

    --ZMQInteractiveShell.deep_reload=<CBool>
        Default: False
        Enable deep (recursive) reloading by default. IPython can use the
        deep_reload module which reloads changes in modules recursively (it
        replaces the reload() function, so you don't need to change anything
        to use it). deep_reload() forces a full reload of modules whose code
        may have changed, which the default reload() function does not. When
        deep_reload is off, IPython will use the normal reload(), but
        deep_reload will still be available as dreload().
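For example, the flag can be passed when starting the engines, or (assuming my_module is already imported on the engines and dview is the view from above) a deep reload can be triggered explicitly with dreload():

    # Start engines with deep reload on (flag quoted from the option above):
    #   ipengine --ZMQInteractiveShell.deep_reload=True

    # Or call dreload() explicitly on each engine:
    dview.execute('dreload(my_module)', block=True)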

I ran into the same problem while working on a module that I wanted to test on remote machines, but I did not want to commit my changes to git and pull them on the engine machines before each reload.

There might be a better way to do this, but my solution was to write a simple helper module that makes it easy to push work-in-progress code to the engines via scp.

I will copy the usage example here:

    import IPython.parallel
    import ishuttle
    import my_module as mm

    # Create a client for the cluster with remote engines
    rc = IPython.parallel.Client(profile='remote_ssh_engines')
    dview = rc[:]

    # Create a shuttle object; the engines' working directory
    # is switched to '/Remote/engine/server/scratch' after its creation
    s = ishuttle.Shuttle(rc, '/Remote/engine/server/scratch')

    # Make my_module available on all engines as mm. This scps the
    # module over, imports it as mm, then reloads it.
    s.remote_import('my_module', import_as='mm')

    # Apply our favourite function from our favourite module
    dview.apply_sync(mm.my_func, 'favourite argument for my_func')
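If scp is not an option, a rough scp-free variant of the same idea is to push the module source through the client and re-exec it on each engine. This is only a sketch, with my_module.py as a placeholder path, not how ishuttle itself works:

    # Read the local working copy of the module's source
    with open('my_module.py') as f:
        source = f.read()

    # Ship the source to every engine as a plain string
    dview.push({'my_module_source': source}, block=True)

    # Rebuild the module from source on each engine and register it
    dview.execute(
        'import imp, sys\n'
        'mm = imp.new_module("my_module")\n'
        'exec(my_module_source, mm.__dict__)\n'
        'sys.modules["my_module"] = mm',
        block=True,
    )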
