Pattern not recognised


(DFrant) #1

Hello,

I have a log file to which I assigned [fields][type] == carxx in my Filebeat configuration so I can do some pattern matching in Logstash.
However, roughly one third of the events are not parsed by the Logstash filter and end up in the filebeat-* index instead of the carxx-* index I created.
I checked my pattern in an online grok tester and it matches the unparsed lines perfectly.
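(One way to rule out differences between the online tester and Logstash itself is to run the same grok against a sample line in a throwaway stdin pipeline; this is just a sketch, reusing the patterns_dir and match pattern from beats.conf below:)

test-grok.conf

input { stdin { } }

filter {
        grok {
                patterns_dir => ["/etc/logstash/conf.d/patterns/"]
                # same match => { "message" => "..." } line as in beats.conf
        }
}

output { stdout { codec => rubydebug } }

Run it with `bin/logstash -f test-grok.conf`, paste an unparsed line, and check whether the event comes back with a _grokparsefailure tag.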

Here is my configuration:

beats.conf

input {
        beats {
                port => "5044"
        }
}


filter {

        if [fields][type] == "carxx" {

                grok {
                        patterns_dir => ["/etc/logstash/conf.d/patterns/"]
                        match => { "message" => "%{ID:id}%{CARXXTIMESTAMP:carxxtimestamp}[%|$]%{TYPE:typetx}[%|$]%{NNUTILISATEUR:nnutilisateur}%{CODETX:codetx}%{NNDOSSIER:nndossier}%{IOUT:iout}%{HEUREREPONSE:heurereponse}%{INS:ins}%{IIN:iin}%{DATA}$" }
                }
        } 

}


output {

        elasticsearch {
                hosts => ["192.168.1.21:9200"]
                index => "%{[fields][type]}-%{+YYYY.MM.dd}"
        }

}

filebeat.yml

filebeat.prospectors:
- input_type: log
  paths:
    - /opt/carxx/carxx_test1
  fields:
    type: carxx
- input_type: log
  paths:
    - /opt/xml_log/xml_test1.log
  fields:
    type: txlog
fields:
  env: staging
output.elasticsearch:
  hosts: ["192.168.1.21:9200"]
  template.path: "filebeat.template.json"
  template.overwrite: true
output.logstash:
  hosts: ["192.168.1.21:5044"]
logging.level: debug
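(To verify the "1/3 unparsed" observation, the daily indices can be compared directly against the Elasticsearch host from the configs above; this is a standard cat-indices query, nothing specific to this setup:)

curl -s '192.168.1.21:9200/_cat/indices/filebeat-*,carxx-*?v'

The docs.count column shows how many events landed in each index per day.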

(Magnus Bäck) #2

This is a duplicate of Duplicated indices.


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.