Logstash: issue parsing logs from multiple locations

--------- filebeat.yml ------
filebeat.prospectors:
- type: log
  # Change to true to enable this prospector configuration.
  enabled: true
  paths:
    - /root/airflow/logs/scheduler/*/*.log
  fields: {log_type: airflow_dags_log}
- type: log
  enabled: true
  paths:
    - /mnt/nfs/qa/logs/unpackdecrypt*_process.log
  fields: {log_type: unpack_process_logs}
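
Side note: Filebeat nests custom fields under a top-level fields key by default, which is why the Logstash conditionals further down test [fields][log_type]. A minimal sketch of the alternative, with a purely illustrative path and log_type:

- type: log
  enabled: true
  paths:
    - /var/log/example/*.log          # hypothetical path, for illustration only
  fields: {log_type: example_log}
  fields_under_root: true             # Logstash would then see [log_type] instead of [fields][log_type]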

--------- input.conf ------
input {
  beats {
    port => 5044
  }
}
--------- output.conf ------
output {
  if [fields][log_type] == "airflow_dags_log" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "airflow_dags_log"
    }
  }
  if [fields][log_type] == "unpack_process_logs" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "unpack_process_logs"
    }
  }
}
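
Once events start flowing, you can confirm that both indices were actually created with the cat API, e.g.:

GET _cat/indices?v

and look for airflow_dags_log and unpack_process_logs in the output.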
--------- filter.conf ------
filter {
  if [fields][log_type] == "unpack_process_logs" {
    csv {
      separator => "|"
      columns => ["LogLevel","LogDate","LogTime","SourceFile","LineNo","filler","ClientShortName","FileName","FileLocation","FileDate","FileSize","FrtId","Status"]
    }
  }

  if [fields][log_type] == "airflow_dags_log" {
    csv {
      separator => " "
      columns => ["LogDate","ProcessID","INFO"]
    }
  }
}
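
To sanity-check the csv filter without Filebeat in the loop, a throwaway pipeline that reads from stdin and prints the parsed event works well. This is only a debugging sketch; the mutate just mimics the log_type field Filebeat would normally add:

input { stdin { } }
filter {
  # pretend this event arrived from Filebeat tagged as unpack_process_logs
  mutate { add_field => { "[fields][log_type]" => "unpack_process_logs" } }
  if [fields][log_type] == "unpack_process_logs" {
    csv {
      separator => "|"
      columns => ["LogLevel","LogDate","LogTime","SourceFile","LineNo","filler","ClientShortName","FileName","FileLocation","FileDate","FileSize","FrtId","Status"]
    }
  }
}
output { stdout { codec => rubydebug } }

Paste a sample log line on stdin and check that each column lands in its own field.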

Error :

[2018-07-25T02:12:17,245][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-07-25T02:12:17,252][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-07-25T02:12:17,272][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-07-25T02:12:17,297][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}

Hello @2anilkumar,

It seems like a version compatibility issue.

Thanks & Regards,
Krunal.

Yes, that's what the log shows, but I'm using fields: {log_type: unpack_process_logs}

Detected a 6.x and above cluster: the type event field won't be used to determine the document _type

This was a WARN, and it still exists.
My issue was due to an existing index pattern; I was not able to create a new index. That's resolved now.
PUT _settings
{
  "index": {
    "blocks": {
      "read_only_allow_delete": "false"
    }
  }
}
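
For anyone hitting the same thing: Elasticsearch typically sets this block automatically when the disk flood-stage watermark is exceeded, so it can come back until disk space is freed. You can check the current value per index with:

GET _all/_settings/index.blocks.read_only_allow_delete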

Delete the index with the similar pattern and recreate it.
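
For example, assuming the conflicting index was an earlier unpack_process_logs index (adjust the name to whatever actually conflicts in your cluster):

DELETE unpack_process_logs

Once new events arrive through the pipeline, the index is created again with the current mapping.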
