
I have a long log file that I need to read from the bottom, line by line, stopping when a certain condition is met, and then returning that line. Also this needs to work on Windows. The file will probably be pretty long, so I don’t want to read the entire file and then reverse it.

I’m currently using reverse-line-reader, however it hasn’t been updated since 2015 and the async functionality leaves much to be desired.

Every other package I’ve seen is either archived, requires me to know the number of lines I want ahead of time, or doesn’t process by lines.

Is there any way to do this in vanilla Node, or is there a package that I haven’t seen that does what I need?

Current code, if it’s at all helpful to explain what I need:

reverseLineReader.eachLine(this.#path, (raw: string, last: boolean) => {
    if (raw) { // skip blank line at end of file
        const line = JSON.parse(raw)

        if (line.event === 'FSDJump') {
            this.location = new System(line)
            Log.write(`Current location set to ${this.location.name}.`)
            this.emit('ENTERED_NEW_SYSTEM')
            return false // stop reading

        } else if (last) {
            log('Unable to find last hyperspace jump. Searching for last known location.')
            return false
        }
    }
})

Thanks!

2 Answers


  1. Here is a basic vanilla Node implementation of a readBackwards method that reads a number of bytes from the end of a file.

    For simplicity it opens the file, uses stat to get the file size and then uses read with an offset calculated from the end of the file.

    If you need to read more from the end of the file, call stat once outside the method and pass the size in as an argument, so you don't stat the file more often than necessary. You can then basically increase positionFromEnd by the size of your buffer on each iteration, so the read window slides over the file from end to beginning.

    const fs = require("fs");
    const { Buffer } = require("buffer");
    
    function readBackwards(
      file,
      buffer,
      offsetInBuffer,
      length,
      positionFromEnd,
      callback
    ) {
      fs.open(file, "r", 0o666, (err, fd) => {
        if (err) return callback(err);
    
        fs.fstat(fd, (err, stat) => {
          if (err) return callback(err);
    
          fs.read(
            fd,
            buffer,
            offsetInBuffer,
            length,
            Math.max(stat.size - length - positionFromEnd, 0),
            (err, bytesRead, buffer) => {
              if (err) return callback(err);
    
              fs.close(fd, (err) => {
                return callback(err, bytesRead, buffer);
              });
            }
          );
        });
      });
    }
    

    Try it out here:
    https://codesandbox.io/p/sandbox/loving-mestorf-hgro1p?file=%2Findex.js%3A8%2C13

    Edit:
    To read line by line you would concatenate what you have read during iterations until the first line-break occurs. Then you split on the line-break(s) and can process those completed lines before reading more from the file.

  2. Node.js does not support reading a file from end to start directly, so my previous suggestion, which tried to create a read stream starting at the end of the file, would not work. That was a mistake on my part.

    You might have to read the file normally and keep track of the lines yourself. If reading the entire file into memory is not feasible due to its size, you could look into using a library that supports this functionality.

    One possible package is "read-last-lines". This library reads lines from the end of a file. It does require specifying the number of lines to read ahead of time, but you can read a fixed number of lines, process them, and, if you don't find the line you're looking for, request a larger number of lines and scan only the newly included, older ones.

    Here is how you might use "read-last-lines" to achieve what you’re looking for:
    const readLastLines = require('read-last-lines');

    async function findLastFSDJump(filePath) {
      const chunkSize = 100; // additional lines to request on each pass
      let lineCount = chunkSize;
      let scanned = 0; // newest lines already checked in earlier passes

      while (true) {
        // read-last-lines always counts from the end of the file, so each
        // pass re-reads the newest lines and grows the window further back
        const chunk = await readLastLines.read(filePath, lineCount);
        const lines = chunk.split('\n').filter((l) => l.trim() !== '');

        // the newly included (older) lines sit at the start of the array;
        // scan them newest-first
        for (let i = lines.length - scanned - 1; i >= 0; i--) {
          const line = JSON.parse(lines[i]);

          if (line.event === 'FSDJump') {
            return line;
          }
        }

        if (lines.length === scanned) {
          return null; // no new lines appeared: the whole file has been scanned
        }
        scanned = lines.length;
        lineCount += chunkSize;
      }
    }

    findLastFSDJump('./path/to/your/file')
      .then((line) => console.log(line))
      .catch((err) => console.error(err));

    In this code, findLastFSDJump reads the file in chunks from the end upwards until it finds a line where event === 'FSDJump'. If it doesn't find such a line, it returns null.
