I'm new to Logstash and I have been looking through the help guides and the forums, but I can't seem to find the answers I need. I am running ELK 6.6.0. I have 3 inputs I'd like to get into Elasticsearch: syslog, a Filebeat reading a CSV file (IPMI sensor data), and another stream of data retrieved from a RESTful endpoint.
I got the IPMI sensor data working. I created an index with the ML data import and managed to get the data into the index using a grok pattern and an output. I tried to create a separate pipeline for the syslog input, but Logstash didn't seem to pick up multiple pipelines, so I am now trying to use tags to redirect the output to different indices, because if I leave everything going through one pipeline the data gets mixed up and the ipmisensor fields get populated with syslog data.
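For reference, this is roughly what I was attempting with multiple pipelines, a `pipelines.yml` in the Logstash config directory (the ids and paths here are just examples from my setup, not necessarily correct):

```yaml
# pipelines.yml
- pipeline.id: ipmi
  path.config: "/etc/logstash/conf.d/ipmi.conf"
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"
```

One thing I'm not sure about: I've read that `pipelines.yml` is ignored if Logstash is started with `-f` or `-e` on the command line, so maybe that's why it wasn't picked up?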
The logs have also started telling me that I am using document types that will be deprecated, which matches some of the useful posts I did find. So if I can't use document_type, what is the best way to do this?
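What I'm trying now instead of document types is conditional routing in the output based on tags, something like the sketch below (the ports, tag names, and index names are just placeholders from my config, not a working setup):

```
input {
  beats {
    port => 5044
    # Filebeat shipping the IPMI CSV data
  }
  udp {
    port => 5514
    tags => ["syslog"]
  }
}

output {
  if "syslog" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "syslog-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "ipmisensor-%{+YYYY.MM.dd}"
    }
  }
}
```

Is this the recommended approach for 6.x, or should I be getting multiple pipelines working instead?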