
I'm playing a bit with Kibana to see how it works.

I was able to add nginx log data directly from the same server without Logstash, and it works properly. But using Logstash to read log files from a different server shows nothing: no errors, but no data.

I have custom logs from PM2, which runs some PHP scripts for me, and the format of the messages is:

Timestamp [LogLevel]: msg

example:

2021-02-21 21:34:17 [DEBUG]: file size matches written file size 1194179

So my grok filter is:

"%{DATESTAMP:timestamp} [%{LOGLEVEL:loglevel}]: %{GREEDYDATA:msg}"

I checked with a grok validator and the pattern matches the file format.

I've got files whose names contain the suffix out, which are debug level, and files with the suffix error for error level.

So to configure Logstash on the Kibana server, I added the file /etc/logstash/conf.d/pipeline.conf with the following:

input {
    beats {
        port => 5544
    }
}
filter {
    grok {
        match => { "message" => "%{DATESTAMP:timestamp} [%{LOGLEVEL:loglevel}]: %{GREEDYDATA:msg}" }
    }
    mutate {
        rename => ["host", "server"]
        convert => { "server" => "string" }
    }
}

output {
    elasticsearch {
        hosts => "http://localhost:9200"
        user => "<USER>"
        password => "<PASSWORD>"
    }
}
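
To rule out a pipeline syntax problem, the config can be checked before starting the service (a sketch, assuming the default deb install path for Logstash):

sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/pipeline.conf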

I needed to rename the host field to server, or I would get errors like Can't get text on a START_OBJECT and failed to parse field [host] of type [text].
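
For reference, an alternative to renaming the whole host object would be to keep only the hostname string and drop the rest of the object (a sketch using the standard Beats host.name field; not what I ended up using):

mutate {
    # keep just the hostname string, then drop the leftover host object
    rename => { "[host][name]" => "server" }
    remove_field => ["host"]
}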

On the 2nd server, where the PM2 logs reside, I configured Filebeat with the following:

- type: filestream
  enabled: true
  paths:
    - /home/ubuntu/.pm2/logs/*-error-*log
  fields:
    level: error
- type: filestream
  enabled: true
  paths:
    - /home/ubuntu/.pm2/logs/*-out-*log
  fields:
    level: debug
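
For completeness, the output side of filebeat.yml points at the Logstash port opened above (the host below is a placeholder for the Logstash server's address):

output.logstash:
  hosts: ["LOGSTASH_SERVER:5544"]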

I tried using log instead of filestream; the results are the same. But it makes sense to use filestream, since the logs are updated constantly, right?

So I have Logstash running on one server and Filebeat on the other. I opened the firewall ports and I can see they're connecting, but I don't see any new data in the Kibana Logs dashboard relevant to the files I fetch with Logstash.

The Filebeat log always shows this line: Feb 24 04:41:56 vcx-prod-backup-01 filebeat[3797286]: 2021-02-24T04:41:56.991Z INFO [file_watcher] filestream/fswatch.go:131 Start next scan, plus something about analytics metrics, so it looks fine. And still no data.
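
As a sanity check on the shipping side, Filebeat can validate its own config and test the connection to the configured output:

filebeat test config
filebeat test output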

I tried to provide as much information here as I can. I'm new to Kibana, and I have no idea why data is not shown in Kibana when there are no errors.

I thought maybe I didn't escape the square brackets properly in the grok filter, so I tried using "%{DATESTAMP:timestamp} \[%{LOGLEVEL:loglevel}\]: %{GREEDYDATA:msg}", which escapes the brackets, but the results are the same.

Any information regarding this issue would be greatly appreciated.

Update:
Using stack version 7.11.1.

I changed back to log instead of filestream based on @leandrojmp's recommendation.

I checked for harvester.go-related lines in the Filebeat log and I found these:

Feb 24 14:16:36 SERVER filebeat[4128025]: 2021-02-24T14:16:36.566Z        INFO        log/harvester.go:302        Harvester started for file: /home/ubuntu/.pm2/logs/cdr-ssh-out-1.log
Feb 24 14:16:36 SERVER filebeat[4128025]: 2021-02-24T14:16:36.567Z        INFO        log/harvester.go:302        Harvester started for file: /home/ubuntu/.pm2/logs/cdr-ftp-out-0.log

I also noticed that when I configured the output to stdout, I do see the events coming from the other server. So Logstash does receive them properly, but for some reason I don't see them in Kibana.
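
For reference, that stdout check can be done with a temporary output block like this (rubydebug just pretty-prints each event):

output {
    stdout { codec => rubydebug }
}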

2 Answers


  1. Chosen as BEST ANSWER

    OK... so @leandrojmp helped me a lot in understanding what's going on with Kibana. Thank you! All the credit goes to you! I just wanted to write a long answer that may help other people overcome the initial setup.

    Let's start fresh.

    I wanted one Kibana node that monitors custom logs on a different server. I have the latest Ubuntu LTS installed on both, added the deb repositories, and installed Kibana, Elasticsearch, and Logstash on the first and Filebeat on the 2nd.

    The basic setup is without much security and SSL, which is not what I'm looking into here since I'm new to this topic. Everything else is mostly defaults.

    In kibana.yml I changed the host to 0.0.0.0 instead of localhost so I can connect from outside; the relevant line is the stock server.host setting:
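
    server.host: "0.0.0.0"

    In Logstash I added the following conf file: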

    input {
        beats {
            port => 5544
        }
    }
    filter {
        grok {
            match => { "message" => "%{DATESTAMP:timestamp} \[%{LOGLEVEL:loglevel}\]: %{GREEDYDATA:msg}" }
        }
        mutate {
            rename => ["host", "server"]
            convert => { "server" => "string" }
        }
    }
    
    output {
        elasticsearch {
            hosts => ["http://localhost:9200"]
        }
    }
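
    Side note: with no index option set on the elasticsearch output, Logstash 7.x writes to ILM-managed indices named like logstash-YYYY.MM.dd-00000N, which is where the index name that turns up below comes from.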
    

    I didn't complicate things and didn't need to set up additional authentication.

    My filebeat.yml configuration:

    - type: log
      enabled: true
      paths:
        - /home/ubuntu/.pm2/logs/*-error-*log
      fields:
        level: error
    - type: log
      enabled: true
      paths:
        - /home/ubuntu/.pm2/logs/*-out-*log
      fields:
        level: debug
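
    After any config change, both services need a restart to pick it up (assuming systemd, as used by the deb packages):

    sudo systemctl restart logstash    # on the Kibana/Logstash server
    sudo systemctl restart filebeat    # on the server with the PM2 logs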
    

    I started everything: no errors in any logs, but still no data in Kibana. Since I had no clue how Elasticsearch stores its data, I needed to find out how to connect to Elasticsearch and check whether the data was there, so I executed curl -X GET http://localhost:9200/_cat/indices?v and noticed a logstash index. I then executed curl -X GET http://localhost:9200/logstash-2021.02.24-000001/_search and saw that the log data was present in the database.
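
    A quick way to keep confirming that documents arrive, without listing them all, is the standard _count API against everything matching logstash-*:

    curl -X GET 'http://localhost:9200/logstash-*/_count?pretty'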

    So it must mean it's something with Kibana. Using the Kibana web interface, under Settings I noticed a configuration called Index pattern for matching indices that contain log data; the input there did not match the logstash index name, so I appended ,logstash* to it and voila! It works :)
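
    For reference, the same thing can apparently also be set outside the UI in kibana.yml via the Logs app source setting (setting name as documented for the 7.x line; double-check it for your exact version):

    xpack.infra.sources.default.logAlias: "filebeat-*,logstash-*"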

    Thanks!


  2. If your output uses both the stdout and elasticsearch outputs but you do not see the logs in Kibana, you will need to create an index pattern in Kibana so it can show your data.

    After creating an index pattern for your data (in your case the index pattern could be something like logstash-*), you will need to configure the Logs app inside Kibana to look for this index; by default the Logs app looks for the filebeat-* index.
