Can't find any Elasticsearch data in Kibana?

This is my logstash.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "syslog" {
    grok {
      match => { "message" => "{WORD:name},%{GREEDYDATA:date},%{NUMBER:CLNo},%{WORD:status}" }
    }
    grok {
      match => { "source" => "%{UNIXPATH}/%{YEAR:yr}%{MONTHNUM:mn}%{MONTHDAY:md}.%{HOUR:hr}%{MINUTE:min}.min" }
    }
  }
  if ("_grokparsefailure" in [tags]) { drop {} }
  else {
    mutate {
      convert => {
        "CLNo" => "integer"
      }
      add_field => {
        "creationTime" => "%{yr}%{mn}%{md} %{hr}:%{min}"
      }
      remove_field => [ "[yr]","[mn]","[md]","[hr]","[min]" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

This is the file I am fetching data from:

name,date,CLI number,review status
xyz,12.12.12,123,completed
abc,12.12.12,1234,ongoing

This is my filebeat.yml:

filebeat:
  prospectors:
    -
      paths:
        - /var/log/cl_data.txt
      #  - /var/log/syslog
      #  - /var/log/*.log

      input_type: log

      fields:
         log_type: syslog

  registry_file: /var/lib/filebeat/registry

output:
  logstash:
    hosts: ["1.1.1.1:5044"]
    bulk_max_size: 1024

Can you please tell me why the data is not displayed in Kibana? Please help!

Can anybody please help?

@Badger, can you please help me with it?

Your grok patterns do not match, so every event gets tagged _grokparsefailure and your conditional drops it. For the first one you could change it to

"^%{WORD:name},%{GREEDYDATA:date},%{NUMBER:CLNo:int},%{WORD:status}$"

The original is missing the % in front of {WORD:name}, and the :int suffix makes grok cast CLNo to an integer during the match, so the mutate convert is no longer needed.

Why not use a csv filter?
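For example, a minimal sketch (the column names are my guesses, chosen to mirror the grok fields; the header line of the file still has to be dropped):

filter {
  if [fields][log_type] == "syslog" {
    csv {
      # map the comma-separated values onto named fields
      columns => ["name", "date", "CLNo", "status"]
      separator => ","
    }
    # the first line of the file is a header, so drop the event it produces
    if [name] == "name" { drop {} }
    mutate {
      # csv parses everything as a string, so cast the CL number
      convert => { "CLNo" => "integer" }
    }
  }
}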

