ERROR: transaction is too large to complete; try splitting into pieces

I am trying to insert 40,000 records into a table in a single transaction using Hibernate (with the Envers plugin for auditing). I am using the Postgres JDBC driver from a Java application. I get the above error after 1,850 rows are inserted. Envers adds a pre-execute hook in Hibernate that loads the existing records for auditing purposes, so every insert query is preceded by a select query. This particular run failed on the select query with `SQL Error: 0, SQLState: XX000` and the message `Caused by: org.postgresql.util.PSQLException: ERROR: transaction is too large to complete; try splitting into pieces`.
Other times, I have seen the failure on the insert query instead, with error 40001 (retry).

Is there a way to increase the transaction buffer? I have inserted far more records than this with Postgres in a single transaction. For this test, I am using a 2-node CockroachDB cluster with both nodes running on the same machine as the application.

Hello Ramesh!

Thank you for your interest in CockroachDB.
A few answers to your points:

I hope this helps; let me know if you have more questions or comments.
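As the error message itself suggests, one workaround is to split the 40,000 rows into several smaller transactions. Here is a minimal sketch of the chunking side of that, assuming a hypothetical chunk size of 1,000 rows (since your run failed around 1,850 rows, you may need an even smaller chunk); the Hibernate-specific commit-per-chunk step is shown only in comments, since it depends on your session setup:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplit {
    // Split a large list into chunks so that each chunk can be
    // inserted in its own transaction, keeping every transaction
    // well under the size limit.
    static <T> List<List<T>> partition(List<T> rows, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += chunkSize) {
            chunks.add(rows.subList(i, Math.min(i + chunkSize, rows.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 40_000; i++) rows.add(i);

        List<List<Integer>> chunks = partition(rows, 1_000);
        System.out.println(chunks.size());        // 40 chunks of 1,000 rows
        System.out.println(chunks.get(0).size()); // 1000

        // Each chunk would then be persisted in its own Hibernate
        // transaction, e.g.:
        //   for (List<Integer> chunk : chunks) {
        //       Transaction tx = session.beginTransaction();
        //       chunk.forEach(session::persist);
        //       session.flush();
        //       session.clear();  // avoid first-level cache growth
        //       tx.commit();
        //   }
    }
}
```

The trade-off is that the load is no longer atomic: if a chunk fails (e.g. with a 40001 retry error), the previously committed chunks stay committed, so your application needs to retry only the failed chunk.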

We are using Hibernate with the Envers add-on. Do you have a solution for this environment?

Hey @rsakala, I think we need a bit more info here. Can you clarify your objective in using these particular tools to insert 40k records at once? Are you trying to migrate data, or is this a typical workload for your application?

Are you still running on a local machine? As @knz mentioned, running multiple nodes on the same machine will negatively impact performance.