Python multiprocessing does not terminate


I am new to Python multiprocessing and I want to understand why my code does not terminate (maybe a zombie or a deadlock) and how to fix it. The createChain function also executes a for loop and returns a tuple: (value1, value2). Inside createChain there are calls to other functions. I don't think posting the createChain code will help, because inside that function I am not doing anything related to multiprocessing. I tried making the processes daemons, but it still didn't work. The strange thing is that if I decrease the value of maxChains, e.g. to 500 or 100, it works.

I just want the process to do some heavy tasks and put the results to a data type.

My version of python is 2.7

from multiprocessing import Process, JoinableQueue, cpu_count


def createTable(chainsPerCore, q, chainLength):
    for chain in xrange(chainsPerCore):
        q.put(createChain(chainLength, chain))


def initTable():
    maxChains = 1000
    chainLength = 10000
    resultsQueue = JoinableQueue()
    numOfCores = cpu_count()
    chainsPerCore = maxChains / numOfCores

    processes = [Process(target=createTable, args=(chainsPerCore, resultsQueue, chainLength,)) for x in range(numOfCores)]

    for p in processes:
        # p.daemon = True
        p.start()

    # Wait for hashing cores to finish
    for p in processes:
        p.join()

    resultsQueue.task_done()

    temp = [resultsQueue.get() for p in processes]
    print temp
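The hang is consistent with a known property of multiprocessing queues: a child process cannot exit while its queue feeder thread is still blocked flushing items into the underlying pipe, so joining the workers before draining the queue deadlocks once the pipe buffer fills (which is why small maxChains values happen to work). A minimal sketch of the drain-before-join fix, using a hypothetical stand-in for createChain:

```python
from multiprocessing import Process, Queue, cpu_count


def createChain(chainLength, chain):
    # Hypothetical stand-in for the real createChain:
    # returns a (value1, value2) tuple like the original.
    return (chain, chainLength)


def createTable(chainsPerCore, q, chainLength):
    for chain in range(chainsPerCore):
        q.put(createChain(chainLength, chain))


def initTable(maxChains=1000, chainLength=10000):
    q = Queue()
    numOfCores = cpu_count()
    chainsPerCore = maxChains // numOfCores
    processes = [Process(target=createTable,
                         args=(chainsPerCore, q, chainLength))
                 for _ in range(numOfCores)]
    for p in processes:
        p.start()
    # Drain the queue BEFORE joining: a child cannot exit while its
    # queue feeder thread is still blocked writing into the pipe.
    results = [q.get() for _ in range(chainsPerCore * numOfCores)]
    for p in processes:
        p.join()
    return results
```

With the get() calls moved ahead of the join() calls, every item is consumed, the feeder threads can flush, and the workers exit normally.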

1 Answer


Based on the very useful comments of Tadhg McDonald-Jensen, I understood my needs better, as well as how the Queues work and what they should be used for.

I changed my code to:

from contextlib import closing
from multiprocessing import Pool


def initTable():
    maxChains = 1000

    with closing(Pool(processes=8)) as pool:
        # map() blocks until all results are back;
        # closing() calls pool.close() when the block exits.
        results = pool.map(createChain, xrange(maxChains))
        pool.terminate()
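For completeness, a self-contained sketch of this pattern. The createChain here is a hypothetical single-argument stand-in, since pool.map passes each chain index as the sole argument; the real function would need the same shape:

```python
from contextlib import closing
from multiprocessing import Pool


def createChain(chain):
    # Hypothetical stand-in: must be a top-level function so it can be
    # pickled and sent to the worker processes.
    return (chain, chain * 2)


def initTable(maxChains=1000):
    # closing() guarantees pool.close() even if map() raises, and
    # map() itself blocks until every worker result has arrived, so
    # there is no join-before-drain deadlock to worry about.
    with closing(Pool(processes=4)) as pool:
        results = pool.map(createChain, range(maxChains))
    pool.join()  # wait for the workers to shut down cleanly
    return results
```

Because map() collects the results itself, no manual queue management is needed, which is what removes the original deadlock.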