I have about a million records in a list that I would like to write to a Netezza table. I have been using the executemany() command with pyodbc, which seems to be very slow (I can load much faster by saving the records to an Excel file and loading that file into Netezza). Is there a faster alternative to executemany() for loading a list?
PS1: The list is generated by a proprietary DAG in our company, so writing to the list is very fast.
PS2: I have also tried looping executemany() over chunks, each chunk containing a list of 100 records. That takes approximately 60 seconds to load, which still seems very slow.
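Roughly, the chunked approach looks like this (the DSN, table, and column names below are placeholders, not the real ones):

    import pyodbc

    CHUNK_SIZE = 100

    def load_in_chunks(records, dsn="NZSQL", table="my_table"):
        # Open a connection through the Netezza ODBC driver.
        conn = pyodbc.connect(f"DSN={dsn}")
        cur = conn.cursor()
        sql = f"INSERT INTO {table} (col1, col2) VALUES (?, ?)"
        # Feed executemany() one slice of the list at a time.
        for start in range(0, len(records), CHUNK_SIZE):
            cur.executemany(sql, records[start:start + CHUNK_SIZE])
        conn.commit()
        cur.close()
        conn.close()

    # e.g. load_in_chunks([(i, f"row{i}") for i in range(1_000_000)])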
Netezza is optimized for bulk loads, whereas executemany() only batches up individual row INSERTs, which is why it is slow at this scale. The best way to load millions of rows is the nzload utility, which can be scheduled via VBScript or an Excel macro on Windows, or a shell script on Linux.
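Since your list already lives in Python, one way to drive nzload is to dump the list to a flat file and shell out to the utility. A minimal sketch, assuming nzload is on the PATH; the host, database, credentials, and table name are placeholders, and you should check the flag set against your nzload version:

    import csv
    import subprocess
    import tempfile

    def nzload_records(records, table="my_table"):
        # Write the in-memory list to a pipe-delimited data file.
        with tempfile.NamedTemporaryFile(
            "w", newline="", suffix=".dat", delete=False
        ) as f:
            csv.writer(f, delimiter="|").writerows(records)
            datafile = f.name

        # Invoke the nzload bulk loader against that file.
        subprocess.run(
            [
                "nzload",
                "-host", "nzhost",          # placeholder host
                "-db", "mydb",              # placeholder database
                "-u", "user", "-pw", "password",
                "-t", table,
                "-df", datafile,
                "-delim", "|",
            ],
            check=True,
        )

The file round-trip costs some I/O, but nzload streams the rows into Netezza as a true bulk load instead of millions of separate INSERTs, which is where the speedup comes from.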