multiprocessing - Python - Pool.map_async only runs every other element in iterable unless chunksize=1


I've run across a weird problem with map_async. I have a bunch of rsync jobs that I'm trying to parallelize using map_async. The problem is that map_async only runs on every other element in the iterable (i.e. half of the input). An abstraction of the problem looks like this:

import multiprocessing

def run_async(section):
    print "running section: ", section

if __name__ == '__main__':

    sections = ['alpha', 'bravo', 'charlie', 'delta', 'echo', 'foxtrot', 'golf']

    pool = multiprocessing.Pool(processes=5)
    result = pool.map_async(run_async, sections)
    pool.close()
    pool.join()

The result is not what I expected: run_async only runs on every other element of sections, so the output is:

running section: alpha
running section: charlie
running section: echo
running section: golf

The only way I've found to overcome the problem is passing chunksize=1 to map_async. Then all elements of sections are passed to the worker processes. Why is that?
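For concreteness, that workaround amounts to something like the following sketch: the same toy script from above, with chunksize=1 passed as the third argument to map_async.

import multiprocessing

def run_async(section):
    # Each worker just reports the section it was handed
    print "running section: ", section

if __name__ == '__main__':

    sections = ['alpha', 'bravo', 'charlie', 'delta', 'echo', 'foxtrot', 'golf']

    pool = multiprocessing.Pool(processes=5)
    # chunksize=1 hands the tasks to the workers one element at a time
    # instead of letting the pool group them into larger chunks
    result = pool.map_async(run_async, sections, chunksize=1)
    pool.close()
    pool.join()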

On my computer, this outputted:

[amit@amit build]$ python2.7 so.py
running section:  alpha

Do you know why? It's because there's a race between the child processes doing their work and the main process ending.

Adding this to the end of the code:

pool.close()
pool.join()

fixed it:

[amit@amit build]$ python2.7 so.py
running section:  alpha
running section:  bravo
running section:  charlie
running section:  delta
running section:  echo
running section:  golf
running section:  foxtrot
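An alternative (not part of the answer above, just a common pattern) is to block on the AsyncResult object that map_async returns: result.get() only returns once every task has completed, so the main process cannot exit before the children finish. A minimal sketch:

import multiprocessing

def run_async(section):
    print "running section: ", section

if __name__ == '__main__':

    sections = ['alpha', 'bravo', 'charlie', 'delta', 'echo', 'foxtrot', 'golf']

    pool = multiprocessing.Pool(processes=5)
    result = pool.map_async(run_async, sections)
    # get() blocks until all tasks have finished, so the main process
    # waits for the workers instead of racing ahead and exiting
    result.get()
    pool.close()
    pool.join()

As a side benefit, get() re-raises any exception raised inside a worker, which close()/join() alone would not surface.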
