When trying to import records that contain “large” data, I’m experiencing the following errors:
(1) Using `IMPORT PGDUMP` syntax:

```
Failed running "sql"
```
(2) Using `cockroach sql < dump.sql`:

```
driver: bad connection
Error: driver: bad connection
Failed running "sql"
```
It appears that if any `INSERT` statement is over 16 MB in length, it fails with the errors above.
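As a rough way to check where a given dump stands relative to that limit (assuming each `INSERT` occupies a single line, and using `dump.sql` only as a placeholder file name), something like this can be used:

```python
# Rough check: report the longest line (i.e. longest statement) in the dump.
# Assumes one INSERT statement per line and a file named dump.sql.
with open("dump.sql", "rb") as f:
    longest = max((len(line) for line in f), default=0)

print(f"longest statement: {longest / (1 << 20):.1f} MiB")
```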
A valid fix would seem to be allowing the user to specify the number of values to batch per `INSERT` when performing the dump, or even having `dump` itself detect the statement length and batch appropriately - even if it is extremely wasteful, e.g. something like `floor(100, 16MB/maxRowLength)` rows per statement. Of course, 100 may be too small for some datasets, and being able to set this larger would also help those datasets load faster.
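To illustrate, here is a minimal sketch of the kind of batching logic I mean (not CockroachDB code: the names `batch_inserts`, `MAX_STATEMENT_BYTES`, and `MAX_ROWS_PER_BATCH` are hypothetical, and rows are assumed to arrive as pre-rendered value tuples like `('a', 1)`):

```python
# Sketch: group rows into INSERT statements so that no single statement
# exceeds a byte budget or a row-count cap.

MAX_STATEMENT_BYTES = 16 * 1024 * 1024  # assumed server-side limit
MAX_ROWS_PER_BATCH = 100                # assumed default; ideally user-tunable


def batch_inserts(table, rows):
    """Yield INSERT statements for pre-rendered value tuples (strings
    like "('a', 1)"), keeping each statement under both limits."""
    prefix = f"INSERT INTO {table} VALUES "
    batch, batch_bytes = [], len(prefix)
    for values in rows:
        row_bytes = len(values) + 2  # account for the ", " separator
        # Flush the current batch if adding this row would exceed a limit.
        if batch and (batch_bytes + row_bytes > MAX_STATEMENT_BYTES
                      or len(batch) >= MAX_ROWS_PER_BATCH):
            yield prefix + ", ".join(batch) + ";"
            batch, batch_bytes = [], len(prefix)
        batch.append(values)
        batch_bytes += row_bytes
    if batch:
        yield prefix + ", ".join(batch) + ";"
```

The byte cap is what actually avoids the 16 MB failure; the row cap (the "100") just keeps statements from growing unboundedly on tables with tiny rows, so exposing both as knobs would also cover the faster-load case.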