After defining a grok filter in Logstash, the index cannot be created

Hi all,
I am using Logstash to ship log files into Elasticsearch. First, here is the Logstash configuration that successfully ships the files into Elasticsearch, and with it I can see the logs in Kibana:

input {
  file {
    path => "/home/srahimi/files/16.txt"
    type => "t16"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.170.153:9200"]
    index => "mainframe"
    user => "logstash_internal5"
    password => "x-pack-test-password"
  }
  stdout { codec => rubydebug }
}

Then I defined a grok filter as follows:

input {
  file {
    path => "/home/srahimi/files/16.txt"
    type => "t16"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{NUMBER:time}:%{NUMBER:c1}#%{NUMBER:c2}#%{WORD:c3}#%{GREEDYDATA:log}" }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.170.153:9200"]
    index => "mainframe"
    user => "logstash_internal5"
    password => "x-pack-test-password"
  }
  stdout { codec => rubydebug }
}

My input looks like this:

20190113:120102756#0025163#I1#120102757:120103620:120103620:000421376:020010:04109:00018732:002:P7T2:P7A6:0:BGEN:0000:0982:0162:
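For reference, the pattern itself does match this line; testing it against the sample (e.g. in the Kibana Grok Debugger) should produce fields roughly like the following rubydebug output (event metadata such as @timestamp, path, and host omitted):

{
    "time" => "20190113",
      "c1" => "120102756",
      "c2" => "0025163",
      "c3" => "I1",
     "log" => "120102757:120103620:120103620:000421376:020010:04109:00018732:002:P7T2:P7A6:0:BGEN:0000:0982:0162:"
}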

After adding the grok filter, the logs are no longer shipped into Elasticsearch and I cannot see the defined fields in Kibana. Notably, there are no errors in the Logstash or Elasticsearch logs.
Any advice would be much appreciated.

If you ran Logstash successfully with the first configuration, then the second configuration is not going to do anything unless new data is appended to that file: the file input records how far it has read in a sincedb file and will not re-read lines it has already processed. If you want to re-read the existing data, you need to add this to the file input:

sincedb_path => "/dev/null"
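
With that setting, the file input from your configuration would look like this (unchanged apart from the added line; note that /dev/null means the read position is never persisted, so the file is re-read from the beginning on every Logstash restart):

input {
  file {
    path => "/home/srahimi/files/16.txt"
    type => "t16"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}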
