I was using the syslog module from Filebeat to send data to Elasticsearch.
In Kibana I could see my data with all the additional fields (e.g. system.syslog.hostname, system.syslog.message, system.syslog.pid, etc.).
Now I have to monitor another file with the same Filebeat.
I use Logstash to parse that file and redirect the syslog events to the Filebeat index, but I lose all the syslog fields.
How can I keep the syslog fields from Filebeat's syslog module in my Logstash/Elasticsearch setup?
Here is my pipeline:
input {
  beats {
    port => "5043"
  }
}

filter {
  if [type] == "cpu_log" {
    grok {
      match => {
        "message" => "%{SYSLOGBASE} %{NUMBER:status:int}"
      }
    }
  }
}
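The pipeline above has no output section. A minimal sketch of one that keeps the module fields by routing the syslog events through Filebeat's ingest pipeline instead of grokking them in Logstash (the host, index pattern, pipeline id, and type value are assumptions, not taken from my actual config):

```conf
output {
  if [type] == "cpu_log" {
    # cpu_log events were already parsed by the grok filter above
    elasticsearch {
      hosts => ["localhost:9200"]            # assumption
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  } else {
    # syslog events: let Filebeat's ingest pipeline parse them,
    # so the system.syslog.* fields are preserved
    elasticsearch {
      hosts    => ["localhost:9200"]         # assumption
      index    => "filebeat-%{+YYYY.MM.dd}"
      pipeline => "filebeat-5.4.0-system-auth-pipeline"  # id assumed, check your version
    }
  }
}
```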
I think I have found the beginning of a solution.
I found the pipeline option of the elasticsearch output in the documentation (Logstash config). I'm trying to use it to apply the ingest pipeline to my different logs, but I get an error at that step:
WARN logstash.outputs.elasticsearch - Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2017.05.30", :_type=>"log", :_routing=>nil, :pipeline=>"filebeat-5.4.0-system-auth-pipeline"}, ******************, :response=>{"index"=>{"_index"=>"filebeat-2017.05.30", "_type"=>"log", "_id"=>nil, "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"pipeline with id [filebeat-5.4.0-system-auth-pipeline] does not exist"}}}}
It says the pipeline with id "filebeat-5.4.0-system-auth-pipeline" does not exist. But in the Kibana dev tools, the request GET _ingest/pipeline/filebeat-5.4.0-system-auth-pipeline returns the pipeline.
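Since Kibana does return the pipeline, one thing worth double-checking is that the hosts setting of the elasticsearch output points at the same cluster that Kibana queries; the pipeline only exists on the cluster where the Filebeat template was loaded. A sketch of the output section with the host made explicit (the host value is an assumption):

```conf
output {
  elasticsearch {
    # Must be the same Elasticsearch cluster that Kibana queries;
    # "pipeline ... does not exist" can mean Logstash is talking
    # to a different cluster than the one holding the pipeline.
    hosts    => ["localhost:9200"]   # assumption
    index    => "filebeat-%{+YYYY.MM.dd}"
    pipeline => "filebeat-5.4.0-system-auth-pipeline"
  }
}
```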
I'm a little lost...
Here is my new Logstash pipeline: