Very slow MySQL import


I’m currently trying CockroachDB for a MySQL database that has grown too large: our MySQL servers cannot scale the way CockroachDB can.
I first imported a dev version of the database: a 31 MB .sql file, imported in 41 seconds without any problem :slight_smile:
Now I’m trying with the production database: an 82 GB .sql file… still not finished after ~21h.

I have two questions:

  • How can we speed up a large import like this one? (We have a 3-node dev cluster, started from the binary.)
  • Is there somewhere we can see the import’s progress?

I’m on version 2.1.0.
One node was started with "start --insecure"
The two others with "start --join=firstnode --insecure"
The .sql file was placed in "cockroach-data/extern/local/"
The import was launched as in the documentation: "import mysqldump 'nodelocal:///local/file.sql';"
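For reference, the setup above can be sketched as shell commands (the hostname `firstnode` and the file path are taken from the post; flags are the standard `cockroach` CLI ones for 2.1):

```shell
# Node 1 (first node of the cluster)
cockroach start --insecure

# Nodes 2 and 3, joining the first node
cockroach start --insecure --join=firstnode

# The dump was placed under the node's extern directory
# (cockroach-data/extern/local/), which is what the
# nodelocal:///local/... URL resolves to. Then, from a SQL shell:
cockroach sql --insecure -e "IMPORT MYSQLDUMP 'nodelocal:///local/file.sql';"
```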

For information, it has now finished :slight_smile: :

root@:26257/test> import mysqldump 'nodelocal:///local/file.sql';

        job_id       |  status   | fraction_completed |    rows    | index_entries | system_records |    bytes
  -------------------+-----------+--------------------+------------+---------------+----------------+-------------
  397995641726631937 | succeeded |                  1 | 1024628573 |    1605325442 |              0 | 98904246724
(1 row)

Time: 23h32m11.293565694s

Only 23h32… any ideas on how to speed it up?

Hey @fprudhom,

Could you provide us with some more information regarding the import? Can you tell me how many tables you were importing, how many indexes, and what the schemas look like?



Hey @fprudhom,

Just wanted to reach back out again and provide an alternate solution to importing. We have a CSV import that is at least 4x faster than the mysqldump import. The reason behind this is that the CSV import takes advantage of multiple cores if they are available, while the mysqldump import is restricted to one core for approximately half of the work.

If you wanted to try the CSV import out, the syntax looks like this:

IMPORT TABLE <TableName> CREATE USING 'location://ddl.sql' CSV DATA ('file1.csv', 'file2.csv', etc...)
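A concrete (hypothetical) example of that syntax, assuming the DDL file and CSV shards have been placed under a node's extern directory — the table and file names here are made up for illustration:

```sql
-- orders_ddl.sql contains only the CREATE TABLE statement for "orders".
-- Splitting the data into multiple CSV files lets the import
-- process them in parallel across cores.
IMPORT TABLE orders
    CREATE USING 'nodelocal:///local/orders_ddl.sql'
    CSV DATA (
        'nodelocal:///local/orders.1.csv',
        'nodelocal:///local/orders.2.csv'
    );
```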

You can also read more on this import statement here.

We also have a GitHub issue that you can follow and +1 regarding the mysqldump import here.

Let me know if you have any questions.


Hi, there are 84 tables (all with indexes and foreign keys). I can’t give you full details due to confidentiality concerns.
We have 4 big tables: 2 with 40 GB of data and ~400k rows each, and 2 others with 10 GB and ~100k rows each.

Many thanks for the answer about the issue; I will follow it.
I will try a CSV import and give you the results in a few days.

Thanks again :slight_smile:

Yes, please let us know how it goes!

@ronarev I am trying to import a MySQL database schema with 12033 tables. I am getting the error “Error: pq: could not mark job 417758005292695554 as succeeded: creating tables: restoring table desc and namespace entries: table already exists”.
Please help me figure out what is going wrong.

Hey @vikram.malvi,

Sorry for the delay in response. Are you still having this issue? If so, could you provide some more detail around the error? Did it happen right as you started the import, or did the import run for a bit and then fail? How did you try to import your mysql database?



Hey @ronarev, thanks for the response — I got it working. The problem was that my table names are numeric, and CockroachDB does not allow creating a table with a numeric name unless it is quoted with “”. I was importing through the MySQL IMPORT utility, and tables were being created as _2354 instead of 12354.
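A minimal sketch of the quoting rule described above (the table name 12354 is taken from the post):

```sql
-- Fails: an unquoted SQL identifier cannot start with a digit.
-- CREATE TABLE 12354 (id INT PRIMARY KEY);

-- Works: double-quoting makes the numeric name a valid identifier.
CREATE TABLE "12354" (id INT PRIMARY KEY);
```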

Hey @vikram.malvi,

Glad to hear you were able to resolve this issue! As always feel free to post any questions you may have on our forum.