@magicsih
Created March 24, 2023 05:10
This Node.js script reads data from a file and inserts it into a MySQL database using bulk inserts. It creates a read stream over the file and an empty array to collect rows. As the stream reads data, it converts each line from a hexadecimal string to a Buffer object and pushes it onto the array. Once the array reaches a certain size (1,000 rows in this script), the stream is paused and the batch is inserted with a single INSERT IGNORE ... VALUES ? query; the stream resumes once the connection's command queue drains, and any remaining rows are inserted when the file ends.
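The bulk insert works because the mysql package expands a nested array bound to the VALUES ? placeholder into a grouped row list. A minimal illustration of that expansion using mysql.format, with two throwaway hex values (the table and column names are the same placeholders used in the script):

const mysql = require('mysql');
// Nested arrays become grouped lists, so two rows expand to (X'...'), (X'...')
const rows = [[Buffer.from('48656c6c6f', 'hex')], [Buffer.from('576f726c64', 'hex')]];
console.log(mysql.format('INSERT IGNORE INTO TABLE_NAME (COLUMN_NAME) VALUES ?', [rows]));
// -> roughly: INSERT IGNORE INTO TABLE_NAME (COLUMN_NAME) VALUES (X'48656c6c6f'), (X'576f726c64')

The full script: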
const fs = require('fs');
const mysql = require('mysql');

// MySQL database connection configuration
const connection = mysql.createConnection({
  host: '<host>',
  user: '<user>',
  password: '<password>',
  database: '<database>'
});

// File path and MySQL query for bulk insert
const filePath = 'data.txt';
const query = 'INSERT IGNORE INTO TABLE_NAME (COLUMN_NAME) VALUES ?';

// Create a read stream from the file path
const stream = fs.createReadStream(filePath);

// Array of rows waiting to be inserted, plus a buffer for a line
// that may be split across two chunks
let data = [];
let remainder = '';

// Process the accumulated rows and execute a bulk insert
function processData() {
  // Copy the data array and reset it
  const values = [...data];
  data = [];

  // Execute bulk insert on the MySQL database
  connection.query(query, [values], (error, results) => {
    if (error) throw error;
    console.log(`Inserted ${results.affectedRows} rows.`);
  });
}

// Called every time the stream reads a chunk of the file
stream.on('data', (chunk) => {
  // Prepend any partial line left over from the previous chunk, then split into lines
  const lines = (remainder + chunk.toString()).split('\n');

  // The last element may be an incomplete line; keep it for the next chunk
  remainder = lines.pop();

  // Convert each complete line from a hexadecimal string to a Buffer object
  for (const line of lines) {
    const hex = line.trim();
    if (hex.length > 0) {
      data.push([Buffer.from(hex, 'hex')]);
    }
  }

  // Once enough rows have accumulated, pause the stream and bulk insert them
  if (data.length >= 1000) {
    stream.pause();
    processData();
  }
});

// When the stream has finished reading the entire file
stream.on('end', () => {
  // Flush the final partial line, if any
  const hex = remainder.trim();
  if (hex.length > 0) {
    data.push([Buffer.from(hex, 'hex')]);
  }

  // Insert any remaining rows, then close the connection
  // (connection.end() waits for queued queries to finish)
  if (data.length > 0) {
    processData();
  }
  console.log('All data has been processed.');
  connection.end();
});

// Resume the stream once the connection's command queue has drained
connection.on('drain', () => {
  stream.resume();
});
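For reference, the script assumes data.txt holds one hexadecimal-encoded value per line, and that COLUMN_NAME is a binary column with a unique (or primary) key, since INSERT IGNORE only skips rows that would violate such a constraint. The setup below is a hypothetical sketch, not part of the original gist; adjust names and sizes to your data:

// data.txt (example contents, one hex string per line):
//   48656c6c6f
//   576f726c64

// Hypothetical target table; run once from any MySQL client before the script:
//   CREATE TABLE IF NOT EXISTS TABLE_NAME (
//     COLUMN_NAME VARBINARY(255) NOT NULL,
//     UNIQUE KEY uq_column (COLUMN_NAME)   -- lets INSERT IGNORE skip duplicate rows
//   );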