Logstash parsing nginx logs: new entries not indexed / index not refreshing

Hello,

I just finished my first ELK installation and I am discovering the different components that make it up. My first exercise is to parse nginx logs with Logstash and a grok filter.

To do so, I created a configuration file in logstash/conf.d with the following configuration:

input {
  file {
    # Absolute path to the nginx access log (the file input requires absolute paths)
    path => ["/var/log/nginx/access.log"]
    type => "nginx"
    # Read the file from the beginning on first run
    start_position => "beginning"
    # Do not persist the read position, so the file is re-read on every restart
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    # Parse the nginx combined log format into structured [nginx][access] fields
    match => { "message" => ["%{IPORHOST:[nginx][access][remote_ip]} - %{DATA:[nginx][access][user_name]} \[%{HTTPDATE:[nginx][access][time]}\] \"%{WORD:[nginx][access][method]} %{DATA:[nginx][access][url]} HTTP/%{NUMBER:[nginx][access][http_version]}\" %{NUMBER:[nginx][access][response_code]} %{NUMBER:[nginx][access][body_sent][bytes]} \"%{DATA:[nginx][access][referrer]}\" \"%{DATA:[nginx][access][agent]}\""] }
    remove_field => "message"
  }
  mutate {
    # Keep the time at which Logstash read the event
    add_field => { "read_timestamp" => "%{@timestamp}" }
  }
  date {
    # Use the log line's own timestamp as @timestamp
    match => [ "[nginx][access][time]", "dd/MMM/YYYY:H:m:s Z" ]
    remove_field => "[nginx][access][time]"
  }
  useragent {
    source => "[nginx][access][agent]"
    target => "[nginx][access][user_agent]"
    remove_field => "[nginx][access][agent]"
  }
  geoip {
    source => "[nginx][access][remote_ip]"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # One index per day
    index => "logstash-nginx-%{+YYYY.MM.dd}"
  }
}
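
Before running it, I check the syntax of the pipeline with the --config.test_and_exit flag (as far as I understand, this only validates the configuration and then exits):

logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/01-file.conf --config.test_and_exit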

Then I ran Logstash with the following command and checked in Elasticsearch that the indices had been created correctly:

logstash --debug --path.settings /etc/logstash -f /etc/logstash/conf.d/01-file.conf

curl -XGET "localhost:9200/_cat/indices?v"
health status index                     uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   .kibana_task_manager      26zR2Q-dSl6sb3vfVaEwog   1   0          2            0     29.8kb         29.8kb
yellow open   logstash-nginx-2019.08.17 wBzhSCIVRJ2FRyfD4tlvqg   1   1         16            0     33.3kb         33.3kb
yellow open   logstash-nginx-2019.08.19 ofUfxMi5SP-IBBPSJy8ydw   1   1         21            0     26.8kb         26.8kb
green  open   .kibana_1                 GJpidYP5SbmUmwFfa8-xKw   1   0         16            1     80.9kb         80.9kb
yellow open   logstash-nginx-2019.08.18 ekS_ILToSriJy53TuRPGdA   1   1         18            0     33.5kb         33.5kb
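
For what it's worth, I suppose the document count can also be checked directly while Logstash is running, to see whether new lines are actually being indexed (assuming the pattern logstash-nginx-* matches the indices above):

curl -XGET "localhost:9200/logstash-nginx-*/_count?pretty"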

Then in Kibana I created visualizations and a dashboard, and everything works, except that new entries in my nginx log are not taken into account, even if I refresh!

If I want my visualizations to pick up these new entries in the nginx log, I have to rerun Logstash so that it creates the index again.
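
In case it matters: I start Logstash by hand with the command above each time. I have not yet tried running it continuously as a service, which, if I understand correctly, would keep it tailing the log file, something like:

sudo systemctl start logstash
sudo systemctl enable logstash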

I think there is a process I have not understood that is needed for the indices created by Logstash to be updated in real time in Elasticsearch.
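
For example, I wondered whether the indices need to be refreshed manually; if I understand the refresh API correctly, something like this should force it (just a guess on my part):

curl -XPOST "localhost:9200/logstash-nginx-*/_refresh"

but I am not sure this is the step I am missing.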

If you could give me some pointers, thanks in advance.
