Set index depending on hostname/custom field

Hello,

I am setting a custom field in Filebeat for the environment, depending on the server.
I am then trying to use that field to create a separate index per environment, but I get an error every time. Not sure what I am doing wrong. Here is my Logstash config file:

input {
  beats {
    port => 5044
  }
}

output {
  if [env] == "DEV" {
    elasticsearch {
      hosts => ["10.204.16.105:9200"]
      index => "elk-dev"
      document_type => "log"
    }
  }
  else if [env] == "QA" {
    elasticsearch {
      hosts => ["10.204.16.105:9200"]
      index => "elk-qa"
      document_type => "log"
    }
  }
  else if [env] == "CERT" {
    elasticsearch {
      hosts => ["10.204.16.105:9200"]
      index => "elk-cert"
      document_type => "log"
    }
  }
}
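As an aside, the three outputs differ only in the index name, so the same routing could be collapsed into a single elasticsearch output with a sprintf-style index. This is only a sketch, not the config above: it assumes [env] is set at the event root, and it lowercases the value first because Elasticsearch index names must be lowercase.

filter {
  mutate {
    # Lowercase the env value ("DEV" -> "dev") so the resulting
    # index name is valid; Elasticsearch index names must be lowercase.
    lowercase => [ "env" ]
  }
}

output {
  elasticsearch {
    hosts => ["10.204.16.105:9200"]
    # Resolves to elk-dev, elk-qa, or elk-cert depending on the event
    index => "elk-%{[env]}"
    document_type => "log"
  }
}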

When I start Logstash I get output like the following:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to /etc/logstash-5.5.2/logs which is now configured via log4j2.properties
[2017-12-06T19:33:21,555][WARN ][logstash.runner          ] --config.debug was specified, but log.level was not set to 'debug'! No config info will be logged.
[2017-12-06T19:33:22,060][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.204.16.105:9200/]}}
[2017-12-06T19:33:22,069][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.204.16.105:9200/, :path=>"/"}
[2017-12-06T19:33:22,141][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.204.16.105:9200/"}
[2017-12-06T19:33:22,144][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-06T19:33:22,179][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-12-06T19:33:22,183][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.204.16.105:9200"]}
[2017-12-06T19:33:22,186][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-12-06T19:33:22,565][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2017-12-06T19:33:22,594][INFO ][logstash.pipeline        ] Pipeline main started
[2017-12-06T19:33:22,678][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

It doesn't really look like there's an issue, but I don't see any logs in Kibana.
Please help.

Here is the Filebeat config as well:

filebeat:
  prospectors:
    - input_type: log
      paths:
        - /tmp/Messages/LOGQ1/*
      encoding: plain
      fields_under_root: true
      exclude_lines: ["DMPMQMSG|^N|Queue|Qmgr"]
      document_type: qalog
      scan_frequency: 10s
      harvester_buffer_size: 16384
      max_bytes: 10485760
      #index: iafelk-qa
      fields:
        env: QA
      multiline.pattern: '<LogRecord'
      multiline.negate: true
      multiline.match: after

You can add a custom field with the mutate filter; it can then be used in the output.
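A minimal sketch of that approach, assuming the tagging is done on the Logstash side instead of in Filebeat (the field name and value here just mirror the Filebeat config above):

filter {
  mutate {
    # Add a custom field that the output conditionals can test
    add_field => { "env" => "QA" }
  }
}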

If I'm not mistaken, you need to add fields to your if line: if [fields][env] == "DEV" {
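Applied to the config above, the first branch would then look like this (a sketch; it assumes the custom field really does arrive nested under [fields], i.e. that fields_under_root is not in effect):

output {
  if [fields][env] == "DEV" {
    elasticsearch {
      hosts => ["10.204.16.105:9200"]
      index => "elk-dev"
      document_type => "log"
    }
  }
  # ... same pattern for the QA and CERT branches
}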
