Multiple pipelines, tags, or if / else if?

I'm new to Logstash and I have been looking through the help guides and the forums, but I can't seem to find the answers I need. I am running ELK 6.6.0. I have three inputs I'd like to get into Elasticsearch: syslog, a Filebeat reading a CSV file of IPMI sensor data, and another stream of data retrieved from a RESTful endpoint.

I got the IPMI sensor data working: I created an index with the ML data importer and managed to get the data into the index using a grok pattern and an output. I then tried to create a separate pipeline for the syslog input, but Logstash didn't seem to pick up multiple pipelines, so I am now trying to use tags to route the output to different indexes, because if I leave everything going through one pipeline the data gets mixed up and the ipmisensors fields get populated with syslog data.
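For context, the pipelines.yml I was experimenting with looked roughly like this (pipeline IDs and paths are just examples). One thing I have since learned is that Logstash ignores pipelines.yml entirely if it is started with -e or -f on the command line, which may be why it was never picked up:

# config/pipelines.yml -- ignored if Logstash is started with -e or -f
- pipeline.id: ipmisensors
  path.config: "/etc/logstash/conf.d/ipmi.conf"
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"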

The logs have also started telling me that I am using document types, which will be deprecated; that matches some of the useful posts I did find. So if I can't use document_type, what is the best way to do this?

Not sure which of those issues you want to address, but I will assume you want to send different types of data to different indexes based on tags. There are two approaches. One is to use conditionals in the output section.

output {
    if "somestring" in [tags] {
        # index names must be lowercase in Elasticsearch
        elasticsearch { index => "foo" }
    } else if "otherstring" in [tags] {
        elasticsearch { index => "bar" }
    } else {
        elasticsearch { index => "baz" }
    }
}
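For this to work the events need to carry the tag in the first place. One place to set it is on the input itself, since tags is a common option available on every input plugin (the tag value and port here are placeholders):

input {
    udp {
        port => 5514
        tags => ["otherstring"]   # every event from this input gets the tag
    }
}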

Another is to use conditionals to put the index name into a field:

filter {
    if "somestring" in [tags] {
        # the prefix ends up in the index name, so keep it lowercase
        mutate { add_field => { "[@metadata][indexPrefix]" => "foo" } }
    } else if "otherstring" in [tags] {
        mutate { add_field => { "[@metadata][indexPrefix]" => "bar" } }
    } else {
        mutate { add_field => { "[@metadata][indexPrefix]" => "baz" } }
    }
}
output {
    elasticsearch { index => "%{[@metadata][indexPrefix]}-%{+YYYY.MM}" }
}
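Note that fields under [@metadata] are not sent to Elasticsearch as part of the event, which is exactly why the prefix is stored there. If you want to see them while testing, the rubydebug codec can display them:

output {
    stdout { codec => rubydebug { metadata => true } }
}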

Thanks for your response, Badger, it got me closer. However, I was still getting errors trying to import: a lot to do with a _doc versus doc type error (I think because I created the index with ML), a date parsing issue with the CSV, the header line that turned out to be a red herring, and a few other things.

logstash | [2019-03-27T00:35:47,065][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"ipmisensors", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x293119c5>], :response=>{"index"=>{"_index"=>"ipmisensors", "_type"=>"doc", "_id"=>"WWuTvGkBsq3oc-rqCpPS", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [ipmisensors] as the final mapping would have more than 1 type: [_doc, doc]"}}}}
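The mapping error above is Logstash writing type doc into an index that the ML importer created with type _doc. One way around it that appears to work on 6.x is to set document_type on the elasticsearch output to match the existing type (a sketch using my index name; note document_type is itself deprecated):

output {
    elasticsearch {
        index => "ipmisensors"
        document_type => "_doc"   # match the type the ML importer created
    }
}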

I can feel the power is right there, and I have enough of a POC working, but there are definitely some issues just getting more than the 'hello world' examples off the ground.
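For the CSV side, in case it helps anyone else, dropping the header line and parsing the timestamp explicitly is roughly what was needed; something like this (the column names, header pattern, and date format are all illustrative):

filter {
    if [message] =~ /^timestamp,/ {
        drop { }                  # skip the repeated CSV header line
    }
    csv {
        columns => ["timestamp", "sensor", "reading"]
    }
    date {
        match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
    }
}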
