I’m currently trying CockroachDB for a MySQL database that has become too large: our MySQL servers cannot scale the way CockroachDB can.
First I imported a dev version of the database: a 31M .sql file, imported in 41 seconds without any problem.
Now I’m trying with the production database: an 82G .sql file, still not finished after ~21h.
I have two questions:
- How can we speed up a large import like this one? (We have a 3-node dev cluster, launched from the binary.)
- Can we see import progress somewhere?
I’m on version 2.1.0.
One node was started with `start --insecure`, and the two others with `start --join=firstnode --insecure`.
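For completeness, the start commands were along these lines (`firstnode` is a placeholder for the first node’s address, and I’m omitting any other flags we passed):

```shell
# First node of the insecure dev cluster
cockroach start --insecure

# The two other nodes join the first one
cockroach start --insecure --join=firstnode
cockroach start --insecure --join=firstnode
```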
The .sql file was put in `cockroach-data/extern/local/`.
The import was launched as shown in the documentation: `IMPORT MYSQLDUMP 'nodelocal:///local/file.sql';`
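On the progress question: I assume a long-running import should show up in `SHOW JOBS` with a `fraction_completed` column, so from a second SQL shell I’d expect something like the query below to work (column names are my assumption from the v2.1 docs); can someone confirm this is the right way to watch it?

```sql
-- List import jobs and their progress (assumes v2.1 SHOW JOBS columns)
SELECT job_id, description, status, fraction_completed
  FROM [SHOW JOBS]
 WHERE job_type = 'IMPORT';
```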