One of the buttons in my Qt GUI starts a ProcessPoolExecutor that generates TIFF images. I want to add a progress bar, but my GUI freezes while the event loop is running.
@pyqtSlot()
def on_pb_start_clicked(self):
    if self.anaglyph_output != None and self.tmp_output != None:
        progress_value = 1
        self.progress_display(progress_value)
        df = self.mask_df
        df_size = 10
        workers = self.cpu_selected
        args = [(i, df[df.fid == i + 1.0], self.anaglyph_output, self.tmp_output) for i in range(df_size)]
        tasks = []
        with ProcessPoolExecutor(max_workers=workers) as executor:
            for arg in args:
                tasks.append(asyncio.get_event_loop().run_in_executor(executor, create_anaglyph, *arg))
            loop = asyncio.get_event_loop().run_until_complete(asyncio.gather(*tasks))
            print(loop)
    else:
        self.display_no_path_warning()
I don't know what to do.
In this code you gain nothing by using async code - and, by using it incorrectly, you actually block the GUI until all the work that would run in parallel in the sub-processes is finished.
asyncio in Python allows for concurrent tasks, but everything you run in it has to be written to cooperate with the event loop. What you are doing here, inside a single-threaded Qt callback (your on_pb_start_clicked), is starting an asyncio loop and telling it to wait until all the tasks placed in it are finished. That call does not return until then, and therefore the Qt UI blocks.

Since you are already running a Qt application, you are better off using Qt's own mechanisms for concurrency - besides the work you offload to other processes with the ProcessPoolExecutor.
In this case, since you do not seem to use the return values, you can simply submit all your tasks to the process pool and return from your function, as in the sketch below.
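A minimal sketch of that "submit and return" approach, assuming the same create_anaglyph worker and the attributes used in your question (mask_df, cpu_selected, anaglyph_output, tmp_output). The executor is stored on self instead of being used in a with block, because the context manager calls shutdown(wait=True) on exit and would block the slot anyway:

from concurrent.futures import ProcessPoolExecutor
from PyQt5.QtCore import pyqtSlot   # assuming PyQt5; adjust the import for your binding

@pyqtSlot()
def on_pb_start_clicked(self):
    if self.anaglyph_output is None or self.tmp_output is None:
        self.display_no_path_warning()
        return
    df = self.mask_df
    df_size = 10
    args = [(i, df[df.fid == i + 1.0], self.anaglyph_output, self.tmp_output)
            for i in range(df_size)]
    # Keep the executor alive on the instance so the workers keep running
    # after this slot returns; a `with` block would wait for them right here.
    self.executor = ProcessPoolExecutor(max_workers=self.cpu_selected)
    self.tasks = [self.executor.submit(create_anaglyph, *arg) for arg in args]
    # The slot returns immediately, so the GUI stays responsive.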
If you need the return values (or want to drive a progress bar), you can register a timed callback with Qt that takes the tasks list you create in this function and calls done() on each future in it, to learn which ones are ready.
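A sketch of that polling approach with a QTimer, assuming PyQt5 and the self.tasks / self.executor attributes from the sketch above. start_progress_polling and check_tasks are hypothetical names, and progress_display (from your question) is assumed here to accept a percentage:

from PyQt5.QtCore import QTimer   # assuming PyQt5

def start_progress_polling(self):
    # Hypothetical helper: call this right after submitting the tasks.
    self.poll_timer = QTimer(self)
    self.poll_timer.timeout.connect(self.check_tasks)
    self.poll_timer.start(200)   # poll every 200 ms

def check_tasks(self):
    finished = sum(1 for fut in self.tasks if fut.done())
    self.progress_display(int(100 * finished / len(self.tasks)))
    if finished == len(self.tasks):
        self.poll_timer.stop()
        results = [fut.result() for fut in self.tasks]   # collect return values
        self.executor.shutdown()
        print(results)

Because the timer only checks which futures are done and returns, the Qt event loop keeps running between polls and the progress bar updates without freezing the GUI.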