I'm creating an application that should do a mysqldump from MySQL running in a Docker container. The application is built in Node.
This is the script that I'm using:
import Dockerode from 'dockerode';
import fs from 'fs';

const containerId = '71501a8ab0f8';
const database = 'my-db';
const exportPath = `${database}.sql`;

const docker = new Dockerode({ socketPath: '/var/run/docker.sock' });
const container = docker.getContainer(containerId);

// Create an exec instance that runs mysqldump inside the container
const exec = await container.exec({
  Cmd: [
    'mysqldump',
    '--single-transaction',
    database,
  ],
  AttachStdin: true,
  AttachStdout: true
});

const stream = await exec.start({
  hijack: true,
  stdin: false
});

// Pipe the exec output straight into the export file
const writeStream = fs.createWriteStream(exportPath);
stream.pipe(writeStream);

stream.on('end', () => {
  console.log('Database dump successfully saved!');
});
This creates the SQL file with the dump of the database, but it's not really readable. When I run file -I my-db.sql
I get the following result:
my-db.sql: application/octet-stream; charset=binary
When I open it in a text editor (Sublime) I see a random sequence of characters.
But when I open the file with, for example, nano
I just see the (plain text) content of the mysqldump.
What I noticed is that every chunk that was added to the generated file starts with some random characters. When I remove them manually and run file -I my-db.sql
again, the result is:
my-db.sql: text/plain; charset=us-ascii
Now I'm also able to open the file in my text editor and see the actual dump.
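(For anyone debugging the same thing: a quick way to see what those leading bytes actually are is to log the start of each chunk coming off the exec stream. This is just a diagnostic sketch, reusing the stream variable from the script above.)

stream.on('data', (chunk: Buffer) => {
  // Print the first 8 bytes of every chunk in hex so the prefix is visible.
  console.log(chunk.subarray(0, 8).toString('hex'));
});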
I tried just writing the chunks to the file as the data arrives, like:
stream.on('data', (chunk: Buffer) => {
  writeStream.write(chunk.toString());
});
But this results in the same issue.
Since the script has to be able to make dumps of large databases, I really want to use the stream and pipe it to a file.
How can I get rid of the "characters" that are added before each inserted chunk, so the file MIME type will be text/plain?
Just found the answer myself. The stream needed to be demultiplexed first.
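In case it helps anyone, here's a sketch of what that looks like with Dockerode's demuxStream helper. It assumes the same container ID, database name, and export path as in the question; the dumpDatabase wrapper function is just for illustration:

import Dockerode from 'dockerode';
import fs from 'fs';

const docker = new Dockerode({ socketPath: '/var/run/docker.sock' });

async function dumpDatabase(containerId: string, database: string, exportPath: string): Promise<void> {
  const container = docker.getContainer(containerId);

  const exec = await container.exec({
    Cmd: ['mysqldump', '--single-transaction', database],
    AttachStdout: true,
    AttachStderr: true,
  });

  const stream = await exec.start({ hijack: true, stdin: false });
  const writeStream = fs.createWriteStream(exportPath);

  // demuxStream splits the multiplexed exec output into stdout and stderr,
  // stripping the header from each frame. stdout goes to the file, stderr to the console.
  container.modem.demuxStream(stream, writeStream, process.stderr);

  // Wait until the exec stream is fully consumed.
  await new Promise<void>((resolve, reject) => {
    stream.on('end', resolve);
    stream.on('error', reject);
  });

  // Close the file; the callback fires once all buffered data has been flushed.
  await new Promise<void>((resolve) => writeStream.end(resolve));
  console.log('Database dump successfully saved!');
}

dumpDatabase('71501a8ab0f8', 'my-db', 'my-db.sql');

The "random characters" are the 8-byte header Docker prepends to every frame of a multiplexed stream (1 byte for the stream type, 3 zero bytes, 4 bytes for the payload length), which is why the file was detected as binary. Demultiplexing removes those headers, leaving only the plain-text dump.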