Pandas to_sql append parameter not inserting dataframe into table


I'm trying to create a script that inserts ~6000 rows into a SQL Server table using the pandas to_sql function with if_exists='append'. If the table doesn't exist yet, the function runs without error, creates the table, and inserts the data. But once the table exists and already contains data, the same call fails instead of appending the new dataframe.

Below is an example of the code that I'm using, with details hidden.

When I run the code I get the error "[42S01] [Microsoft][SQL Server Native Client 11.0][SQL Server]There is already an object named 'table_name' in the database. (2714) (SQLExecDirectW)". I'm not sure why this is raised, since 'append' should append the dataframe to the table when the table already exists. Even when if_exists is set to "replace" I get the same error, so it seems like to_sql is ignoring the if_exists parameter entirely.

import urllib
from sqlalchemy import create_engine

# sqlparams is the ODBC connection string (details hidden)
params = urllib.parse.quote_plus(sqlparams)
engn = create_engine("mssql+pyodbc:///?odbc_connect={}".format(params))

dfToLoad.to_sql('table_name', con=engn, if_exists='append',
                schema='schema_name', index=False)

I've tried both "replace" and "append" for the if_exists parameter and neither works once the table exists. I've also tried inserting into the table directly with SQLAlchemy, with no luck. Has anyone run into this error and found a solution?
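The direct SQLAlchemy attempt looked roughly like this; it's only a sketch, and the column names col_a and col_b are placeholders for the real ones:

from sqlalchemy import text

# Rough sketch of a direct parameterized insert; column names are placeholders.
with engn.begin() as conn:  # begin() opens a transaction and commits on success
    conn.execute(
        text("INSERT INTO schema_name.table_name (col_a, col_b) VALUES (:a, :b)"),
        [{"a": r.col_a, "b": r.col_b} for r in dfToLoad.itertuples(index=False)],
    )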

I've seen in other posts that passing a raw connection object to to_sql instead of the engine can lead to data not being inserted. But I've confirmed that my code does insert the data when the table doesn't exist, and that I'm passing the object returned by create_engine() as the con parameter of to_sql.
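One way to sanity-check this is to ask the engine whether it can actually see the existing table under the schema being targeted; this is just a diagnostic sketch using SQLAlchemy's inspector:

from sqlalchemy import inspect

insp = inspect(engn)
# If this prints False even though the table exists, the existence check is
# looking in a different schema than the one the table lives in, so to_sql
# issues CREATE TABLE again and SQL Server raises error 2714.
print(insp.has_table("table_name", schema="schema_name"))
print(insp.get_table_names(schema="schema_name"))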

edit: I was able to find a solution. I was targeting the default schema tied to my own database profile; creating a separate schema to house the table fixed the error. I'm still not sure why the data couldn't be loaded into an existing table under my own schema, but for now this is resolved.
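For anyone who hits the same thing, the workaround looked roughly like this; "load_schema" is a made-up name for the new schema that houses the table:

from sqlalchemy import text

# One-time setup: create a dedicated schema if it doesn't exist yet.
# CREATE SCHEMA must be the only statement in its batch, hence the EXEC wrapper.
with engn.begin() as conn:
    conn.execute(text("IF SCHEMA_ID('load_schema') IS NULL EXEC('CREATE SCHEMA load_schema')"))

# Point to_sql at the new schema explicitly.
dfToLoad.to_sql('table_name', con=engn, if_exists='append',
                schema='load_schema', index=False)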
