I have a 200 million row DataFrame that I've sliced in memory into 200,000-row chunks to insert gradually.
After d6tstack.utils.pd_to_psql has been running for about 30 minutes (roughly the halfway mark), I get the following error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "...", port ... failed: FATAL: sorry, too many clients already
I added the following lines to d6tstack/utils.py and the error stopped reproducing:
sql_cnxn.close()      # return the raw DBAPI connection
sql_engine.dispose()  # drop the engine's connection pool
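The likely cause is that each pd_to_psql call creates a fresh SQLAlchemy engine whose pooled connections are never released, so repeated calls accumulate server-side connections until Postgres hits its max_connections limit. A minimal sketch of the fix pattern (explicit dispose after each chunk) is below; it uses SQLite for illustration, and the function name, table name, and chunk size are hypothetical, but the cleanup pattern is the same for Postgres:

```python
import pandas as pd
from sqlalchemy import create_engine

def insert_chunks(df, url, table, chunksize=200_000):
    """Insert df into `table` in chunks, disposing the engine after
    each chunk so server-side connections stay bounded."""
    for start in range(0, len(df), chunksize):
        engine = create_engine(url)  # mirrors one pd_to_psql call
        try:
            df.iloc[start:start + chunksize].to_sql(
                table, engine, if_exists="append", index=False
            )
        finally:
            # Without this, each iteration leaks a pooled connection,
            # eventually producing "too many clients already".
            engine.dispose()
```

With Postgres you would pass a `postgresql://...` URL instead; alternatively, creating a single engine once and reusing it across chunks avoids the problem without per-chunk disposal.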