PyRFC won't allow multiprocessing?


I'm using PyRFC on Databricks and need to run around 20k queries against SAP. Because of the volume, I want to parallelize the queries with multiprocessing. Here's what I have:

from multiprocessing import Process, Queue
from pyrfc import Connection

ASHOST='Some_Server_Name'
CLIENT='xx'
SYSNR='xx'
USER='xxxx'
PASSWD='xxxx'
conn = Connection(ashost=ASHOST, sysnr=SYSNR, client=CLIENT, user=USER, passwd=PASSWD)


q = Queue()

def worker(plant, material):
    print("called")
    q.put(conn.call(
        "Func_Name",
        **{"WERKS": plant, "MATNR": material.zfill(18)}
    ))

jobs = []
for i in plant_collection[:10]:
    p = Process(target=worker, args=(i["plant"], i["material"]))
    jobs.append(p)

for j in jobs:
    j.start()

print(q.get())

I'm using the multiprocessing library. The print statement inside worker never executes, and q.get() never returns; the script just hangs. What am I doing wrong, and how can I fix this?
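For comparison, here is a minimal sketch of the queue/worker pattern I was aiming for, with the SAP call replaced by a placeholder function (`fake_rfc_call` is mine, not pyrfc's) so it runs standalone. It differs from my code in two ways: the queue is passed to each worker explicitly, and results are drained before joining. In the real version, each worker would presumably also open its own Connection instead of reusing one created in the parent:

```python
from multiprocessing import Process, Queue

def fake_rfc_call(plant, material):
    # Placeholder standing in for conn.call("Func_Name", ...).
    # In the real worker, a pyrfc Connection would be opened here,
    # inside the child process, since a handle created in the parent
    # may not survive the fork.
    return {"WERKS": plant, "MATNR": material.zfill(18)}

def worker(q, plant, material):
    # Put this worker's result onto the shared queue.
    q.put(fake_rfc_call(plant, material))

def run(collection):
    q = Queue()
    jobs = [Process(target=worker, args=(q, i["plant"], i["material"]))
            for i in collection]
    for j in jobs:
        j.start()
    # Drain the queue before joining: join() can deadlock if a child
    # is still blocked writing to a full queue.
    results = [q.get() for _ in jobs]
    for j in jobs:
        j.join()
    return results

if __name__ == "__main__":
    sample = [{"plant": "P1", "material": "42"},
              {"plant": "P2", "material": "7"}]
    print(run(sample))
```

This sketch runs to completion with stdlib only; whether the hang in my version comes from sharing conn across processes, from never joining, or from something Databricks-specific is exactly what I'm trying to pin down.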
