Paramiko and the multiprocessing module

I am trying to use the paramiko Python module (1.7.7.1) to simultaneously execute commands and/or transfer files across a group of remote servers. One of the tasks looks like this:

jobs = []   
for obj in appObjs:
    if obj.stop_app:
        p = multiprocessing.Process(target=exec_cmd, args=(obj, obj.stop_cmd))
        jobs.append(p)
        print "Starting job %s" % (p)
        p.start()

"obj" contains, among other things, paramiko SSHClient, transport, and SFTPClient. The appObjs list contains approximately 25 of these objects and, therefore, 25 connections to 25 different servers.

I get the following error, raised from paramiko's transport.py in the traceback:

raise AssertionError("PID check failed. RNG must be re-initialized after fork(). 
Hint:   Try Random.atfork()")

I patched /usr/lib/python2.6/site-packages/paramiko/transport.py based on the post at https://github.com/newsapps/beeswithmachineguns/issues/17 , but it doesn't seem to help. I have verified that the transport.py at that path is the one being used. The paramiko mailing list seems to have disappeared.
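For reference, the workaround that issue (and the error's own hint) points at is to re-initialize PyCrypto's RNG inside each child process after the fork, rather than only patching transport.py. A minimal sketch, assuming exec_cmd is the multiprocessing target from the snippet above and obj.client is the hypothetical SSHClient attribute from the earlier sketch:

from Crypto import Random

def exec_cmd(obj, cmd):
    # Re-seed PyCrypto's RNG in the child before any paramiko traffic;
    # this is what the "Hint: Try Random.atfork()" in the assertion refers to.
    Random.atfork()
    stdin, stdout, stderr = obj.client.exec_command(cmd)
    print(stdout.read())

Note that this only addresses the RNG assertion; reusing a Transport/socket opened in the parent from several child processes can still be unreliable, which is why the script in the second answer below opens each connection inside the worker instead.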


Update: as @ento points out in the comments, the forked ssh package has since been merged back into mainline Paramiko.

Instead of Paramiko (you are on Paramiko 1.7.7.1), use the ssh package from PyPI (currently at 1.7.11), which is a maintained fork of Paramiko.

Some background: development of mainline Paramiko had stalled, so @bitprophet, the maintainer of Fabric, forked Paramiko as the ssh package on PyPI and maintains it there; his write-up has the gory details.
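Since the fork keeps Paramiko's class names, switching is usually just an import change. A minimal sketch (the try/except alias is my own suggestion, not from the answer):

# Install the fork first:  pip install ssh
try:
    import ssh as paramiko   # prefer the maintained fork if it is installed
except ImportError:
    import paramiko          # otherwise fall back to stock paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())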


As the other answer says, the RNG/fork problem is in Paramiko; with the ssh package, multiprocessing works fine for me where paramiko did not. Here is an example script (replace the placeholder host names and username with your own):

#!/usr/bin/env python
# -*- coding: utf-8 -*-

import ssh
from multiprocessing import Pool
import getpass

hostnames = [HOST1, HOST2]  # placeholders: fill in your own host names
user = USERNAME             # placeholder: fill in your own ssh user name
pw = getpass.getpass("Enter ssh password:")

def processFunc(hostname):
    # Each worker opens its own SSH connection inside the child process.
    handle = ssh.SSHClient()
    handle.set_missing_host_key_policy(ssh.AutoAddPolicy())
    handle.connect(hostname, username=user, password=pw)
    print("child")
    stdin, stdout, stderr = handle.exec_command("ls -l /var/log; sleep 5")
    # Collect the command's stdout line by line until the channel is exhausted.
    cmdOutput = ""
    while True:
        try:
            cmdOutput += stdout.next()
        except StopIteration:
            break
    print("Got output from host %s:%s" % (hostname, cmdOutput))
    handle.close()

# One worker process per host, chunk size 1 so each host maps to one task.
pool = Pool(len(hostnames))
pool.map(processFunc, hostnames, 1)
pool.close()
pool.join()

## If you want to compare speed:
# for hostname in hostnames:
#     processFunc(hostname)
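
Note that each worker creates its own SSHClient inside processFunc, i.e. after the Pool has forked, and the parent never opens an SSH connection itself, so no connection state created in the parent is ever reused in a child.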