I have two database servers that are replicated using master-to-master replication. For some reason, replication stopped. Now I want to compare the same database, EXAMPLE, on both servers for data consistency. The problem is that the database has a very large table with 60 million rows; the MySQL data size is around 10 GB. I have tried the mysqldbcompare command from MySQL Utilities. This tool works very well on small data sets, but in this case the connection is eventually dropped after about an hour.
Can anyone help me with this problem? Has anyone analyzed a large amount of MySQL data and can share their experience?
Please tell me the best way to start. What tools should I use, and how should I use them? I need to do this in a very short amount of time.
The following approach should solve your comparison problem. It is a divide-and-conquer approach: the data is exported to one file per table, and the comparison is then done file by file, i.e. table by table. This reduces the amount of data handled in any single step. My script was written for Windows, but a slight change will make it work on any other OS.
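As a minimal sketch of the comparison step (assuming you have already exported each table on both servers, e.g. with `mysqldump --skip-comments EXAMPLE table_name > table_name.sql`, into one directory per server; the directory paths and function names here are hypothetical, and this uses Python rather than a Windows batch script):

```python
import hashlib
import os

def file_checksum(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in chunks
    so even multi-gigabyte dump files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_dump_dirs(dir1, dir2):
    """Compare per-table dump files by name across two directories.

    Returns a dict listing tables present on only one server and
    tables whose dump contents differ between the two servers."""
    tables1 = set(os.listdir(dir1))
    tables2 = set(os.listdir(dir2))
    mismatched = []
    for name in sorted(tables1 & tables2):
        if file_checksum(os.path.join(dir1, name)) != file_checksum(os.path.join(dir2, name)):
            mismatched.append(name)
    return {
        "only_on_server1": sorted(tables1 - tables2),
        "only_on_server2": sorted(tables2 - tables1),
        "mismatched": mismatched,
    }
```

Note that this relies on the dumps being byte-identical for identical data, so rows should be exported in a deterministic order (for example with mysqldump's `--order-by-primary` option); tables flagged as mismatched can then be diffed individually.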
I hope this solves your problem.