Bulk import of a CSV file

I'm trying to populate my table with a CSV file.

At first I tried this command template with a CSV file of 10 rows:

IFS=","; while read a b c; do echo "INSERT INTO csv VALUES ($a, $b, $c);"; done < test.csv

and it worked, but when I tried to use the original file, I got the following error:

driver: bad connection
connection lost; opening new connection and resetting session parameters…

I think this is because of the 16 MB limit.

Is there any other way I can import the CSV file?


Edit 1:
I reduced the number of rows so that the file is 12 MB, but the error is still the same.


Edit 2:
I reduced the number of rows so that the file is under 10 MB, and it worked,
but I'm still looking for a way to load all the rows, not just a limited amount.

This is almost certainly caused by internal limits we have on the number and size of rows that can be inserted in a single transaction.

You can fix this by not running this as a single transaction. If you have a file of INSERTs separated by semicolons and newlines, that should work, although it may be slow.
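
For illustration, the file I have in mind looks roughly like this: one self-contained statement per line, with no BEGIN/COMMIT around them (the table name and values here are placeholders based on your command above):

INSERT INTO csv VALUES (1, 2, 3);
INSERT INTO csv VALUES (4, 5, 6);
INSERT INTO csv VALUES (7, 8, 9);

With no explicit transaction, each statement runs in its own implicit transaction, so no single transaction gets anywhere near the size limit.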


But still, nothing should crash the server (or otherwise close the client connection). So some server logs would be priceless.


Thanks,
would you please explain more about the file of INSERTs?

Do you mean to split the original CSV file into smaller files, with an INSERT statement for each CSV file?

What should be the format and extension of the file of INSERTs?
Is there any documentation or link I can follow?

Generate a file that has one INSERT per line first, then pipe that into cockroach sql. If the file has no BEGIN statement, it should execute each line individually, which may fix the problem you are seeing above.
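
As a sketch, something along these lines should do it. The table name csv, the three columns, and the inserts.sql filename are assumptions carried over from your earlier command; you would add your own connection flags (--insecure, --host, --database, etc.) to cockroach sql:

# Turn each CSV row into a standalone INSERT statement
# (assumes three numeric columns; string values would need quoting/escaping).
while IFS=, read -r a b c; do
  echo "INSERT INTO csv VALUES ($a, $b, $c);"
done < test.csv > inserts.sql

# Pipe the statements into the SQL shell; since the file has no BEGIN,
# each line is executed as its own statement.
cockroach sql --insecure < inserts.sql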

However, andrei brings up a really good point that it shouldn't be crashing anyway, so it may not be that bug. We need to see some logs or the offending INSERT statement to help out more.
