Index a file line by line in logstash

I am using Logstash to ingest logs from multiple applications into Elasticsearch. I have set up filters for these logs and they are working fine.

For one application I need to index the log file line by line, without any mapping or filtering. Can someone tell me how to do that from the Logstash filter section? I cannot find any filter that would do it.

Hello,

First, I assume you can separate your Logstash processing rules between your different applications.

If so, I would use only an input and an output (without any filtering or mapping), like I do on one of my servers:

input {    # adapt the input to your needs
  beats {
    port => "8754"
  }
}

# no filter section at all (remove it or leave it commented out):
# filter {
# }

output {
  if [logtype] == "log_prd" {
    file {
      path  => "/opt/dir/log_prd_%{+yyyy_MM_dd}.log"
      codec => line { format => "%{message}" }
    }
  }
  else if [logtype] == "log_dev" {
    file {
      path  => "/opt/dir/log_dev_%{+yyyy_MM_dd}.log"
      codec => line { format => "%{message}" }
    }
  }
}

On my side, I am routing the output based on a field added on the Filebeat side ("logtype"), but you can use your own separation method. The line codec with format => "%{message}" writes each event's raw message as-is, one line per event, with no parsing applied.
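
For reference, this is roughly how I add that field on the Filebeat side (a sketch only; the path and the logtype value are placeholders for your own setup):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log    # placeholder path
    fields:
      logtype: log_prd          # the field tested in the Logstash output conditionals
    fields_under_root: true     # puts logtype at the event root instead of under [fields]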

Hi,

Thanks for the reply. I understand what you are saying, but I have already applied conditional logic in my Logstash filter section, since I do need to filter the logs coming from my other applications.

My output section is shared and only connects to the Elasticsearch API. Please see my config file below:

input {
  beats {
    port => 5044
  }
}

filter {
  if [app] == "pythoncron" {
    grok {
      match => { "message" => "%{DATA:data}" }
    }
  }
  else {
    dissect {
      mapping => {
        "message" => "%{timestamp}|%{application}|%{module}|%{traceid}|%{severity}|%{info}"
      }
    }
  }
}

output {
  elasticsearch {
    manage_template => false
    hosts => "http://localhost:9200"
    index => "%{[env]}-%{[app]}"
  }
  stdout { codec => "dots" }
}

As you can see, I have tried to ingest the logs from the "pythoncron" application line by line without any real parsing, by putting in a sort of generic filter ("message" => "%{DATA:data}"), but it does not seem to be working correctly.
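
From what I can tell, %{DATA} is non-greedy, so it may just be matching an empty string; something like this would at least copy the whole line into a field (a sketch of what I mean):

grok {
  match => { "message" => "%{GREEDYDATA:data}" }    # GREEDYDATA captures the entire line
}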

So I am not sure whether I can implement your suggestion above in my "output" section based on these conditions. Or maybe I do not understand your suggestion completely :slight_smile:
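
One more thought: could I simply let the "pythoncron" events pass through the filter untouched? A sketch of what I mean, assuming that events matching no filter branch reach the shared elasticsearch output with their raw message intact:

filter {
  if [app] != "pythoncron" {    # pythoncron events skip parsing entirely
    dissect {
      mapping => {
        "message" => "%{timestamp}|%{application}|%{module}|%{traceid}|%{severity}|%{info}"
      }
    }
  }
}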
