Best approach to creating mappings in Elasticsearch/Logstash

Hello everyone,

I am new to the ELK stack. I am currently in the process of deploying a POC to migrate our small Splunk deployment to ELK. One of the problems I am having is creating mappings in Elasticsearch.

Currently all the logs are saved on a syslog server with application-specific filenames. Logs flow as Filebeat -> Logstash -> Elasticsearch, with dashboards in Kibana.
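For context, the Filebeat side is just a log input per application directory, roughly like this (the paths, app name, and host below are placeholders, not my real config):

```yaml
# filebeat.yml (sketch; paths, app_name, and host are placeholders)
filebeat.inputs:
  - type: log
    paths:
      - /var/log/apps/app1/*.log
    fields:
      app_name: app1          # tag events so Logstash can route them
    fields_under_root: true

output.logstash:
  hosts: ["logstash.example.com:5044"]
```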

I am creating multiple application-specific pipelines in Logstash, parsing the fields, and writing to Elasticsearch as daily appname-(mm-dd-yyyy) indices, along the lines of the sketch below.
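Each pipeline looks roughly like this; the grok pattern, field names, and app name here are simplified placeholders:

```
# pipeline for one application (sketch; pattern and names are illustrative)
filter {
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:log_time} %{GREEDYDATA:msg}" }
  }
  date {
    match => ["log_time", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "app1-%{+MM.dd.yyyy}"   # one index per application per day
  }
}
```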

By default, the fields do not get mapped correctly. Should I just run GET /<index_name>/_mapping, copy the output, edit it, and save it as a new template referenced from Logstash, or is there a better way to do it?
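For example, would hand-writing an index template like the sketch below and PUTting it through the Kibana Dev Tools console be the recommended route? The field names here are made up, just to show the idea:

```
PUT _index_template/app1-template
{
  "index_patterns": ["app1-*"],
  "template": {
    "mappings": {
      "properties": {
        "log_time":    { "type": "date" },
        "client_ip":   { "type": "ip" },
        "duration_ms": { "type": "long" },
        "msg":         { "type": "text" }
      }
    }
  }
}
```

I am on a recent Elasticsearch version, so I am looking at the composable index template API; I understand older versions use PUT _template instead.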

Appreciate your help.
