Read plain or compressed log files from disk, deliver as [batches of] lines to a log consumer. Wait for the log consumer to report success. Advance bookmark. Repeat ad infinitum.
npm i safe-log-reader
const read = require('safe-log-reader');
const path = require('path');

read.createReader(filePath, {
    batchLimit: 1024,
    bookmark: {
        dir: path.resolve('someDir', '.bookmark'),
    },
})
.on('readable', function () { this.readLine(); })
.on('read', function (line, count) {
    // do something with this line of text
})
.on('end', function (done) {
    // close up shop and go home
});
The key "missing" feature of the node "tail" libraries is the ability to resume correctly after the app has stopped reading (think: kill -9) in the middle of a file.
Because files are read as [chunks of] bytes while log entries are lines, resuming at the file's last byte position will likely land in the middle of a line, or even split a multi-byte character. Buffered bytes not yet emitted as lines are lost, unless the reader rewinds at restart and replays the last full $bufferSize, in which case the consuming app must detect and suppress duplicate lines.
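A quick illustration (not part of safe-log-reader) of why a raw byte offset is an unsafe resume point: splitting a UTF-8 stream mid-character corrupts the text.

```javascript
// 'é' encodes as two bytes (0xC3 0xA9), so 'café\n' is six bytes.
const buf = Buffer.from('café\n');

// Suppose the reader stopped after byte 4 -- inside the 'é'.
const tail = buf.slice(4);

// Decoding from there yields the U+FFFD replacement character,
// and the original line can no longer be reconstructed.
console.log(JSON.stringify(tail.toString()));
```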
The key to resuming a log file safely is tracking the line count and the byte-stream offset that the consuming app has committed. When a bookmark is saved, the file position advances to the byte offset of the last line the application has safely committed.
Safe-log-reader uses a Transform Stream to convert the byte stream into lines. This makes it dead simple to read compressed files by adding a .pipe(ungzip()) into the stream.
Copyright 2015 by eFolder, Inc.