Large (100s of GB) Import

Hi,

We currently have a large MS SQL database that I’d like to look at moving into CockroachDB, mainly to take advantage of the distributed query functionality.

At present, I’m not sure of the best method to get this data in; all the examples I’ve seen show files of a couple of MB and raw SQL INSERTs.

Does anybody have any ideas?

I can think of a slow-running job/service to queue and transfer the data across (a rough sketch follows). I’d appreciate any ideas.
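Something like the following is what I’m picturing, purely as a sketch: pyodbc on the SQL Server side, psycopg2 against CockroachDB’s Postgres-compatible port, with the table, columns, connection strings, and batch size all made up for illustration.

```python
# Rough sketch only: table, columns, connection strings, and driver name are placeholders.
import pyodbc                          # reads from the MS SQL side
import psycopg2                        # CockroachDB speaks the Postgres wire protocol
from psycopg2.extras import execute_values

BATCH = 10_000                         # rows moved per round trip

src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=mssql-host;DATABASE=mydb;UID=user;PWD=secret"
)
dst = psycopg2.connect("postgresql://root@crdb-node1:26257/mydb?sslmode=disable")

src_cur = src.cursor()
src_cur.execute("SELECT id, name, amount FROM dbo.orders")   # placeholder table/columns

dst_cur = dst.cursor()
while True:
    rows = src_cur.fetchmany(BATCH)
    if not rows:
        break
    # One multi-row INSERT per batch rather than row-by-row statements.
    execute_values(
        dst_cur,
        "INSERT INTO orders (id, name, amount) VALUES %s",
        [tuple(r) for r in rows],
    )
    dst.commit()

src.close()
dst.close()
```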

Peter

Isn’t that how a normal mysqldump works? Raw INSERT statements?

I seem to get the best (quickest) results by splitting the data into multiple streams and feeding a portion to each node at the same time (tried with hundreds of MB, not GBs), and by keeping the individual INSERTs under 50k. Generally it should also help to add the indexes after importing the data, but I didn’t bother for my tests.
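For what it’s worth, a minimal sketch of that kind of parallel feed might look like this, assuming Python with psycopg2 (CockroachDB is wire-compatible with Postgres). The node addresses, table, columns, and batch size are placeholders, and the rows are assumed to already be in memory as tuples; in practice they’d be read in chunks from the source.

```python
# Sketch of the parallel-stream feed: node addresses, table, columns, and the
# batch size are placeholders, not the values used in the tests described above.
import threading
import psycopg2
from psycopg2.extras import execute_values

NODES = [
    "postgresql://root@crdb-node1:26257/mydb?sslmode=disable",
    "postgresql://root@crdb-node2:26257/mydb?sslmode=disable",
    "postgresql://root@crdb-node3:26257/mydb?sslmode=disable",
]
BATCH = 10_000   # stay comfortably under the "individual inserts under 50k" guideline

def feed(dsn, rows):
    """Push one stream's share of the rows through a single node."""
    conn = psycopg2.connect(dsn)
    cur = conn.cursor()
    for i in range(0, len(rows), BATCH):
        execute_values(
            cur,
            "INSERT INTO orders (id, name, amount) VALUES %s",
            rows[i:i + BATCH],
        )
        conn.commit()
    conn.close()

def parallel_import(all_rows):
    # Round-robin the rows into one stream per node, then run the streams concurrently.
    streams = [all_rows[i::len(NODES)] for i in range(len(NODES))]
    threads = [
        threading.Thread(target=feed, args=(dsn, chunk))
        for dsn, chunk in zip(NODES, streams)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Each thread keeps its own connection to a different node, so the streams don’t all serialise through a single gateway node or a single session.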