Indexing based on fields in filebeat.yml

Hi,
I am using FB to send logs to LS-Shipper.
Event processing pipeline: FB -> LS-Shipper -> Redis -> LS-Indexer -> nginx -> ES
Below is the FB configuration.
Note: I have created one field, Application, with the 'A' in uppercase.

filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /var/log/messages

      input_type: syslogs

      fields:
        Application: tf

      fields_under_root: true

      document_type: syslogs

  registry_file: /var/lib/filebeat/registry

output:
  ### Logstash as output
  logstash:
    # The Logstash hosts
    hosts: ["shipper:5000"]

    # Optional TLS. By default it is off.
    #tls:
      # List of root certificates for HTTPS server verification
      #certificate_authorities: ["/opt/logstash-forwarder/SSL/abc-issuing.cer.pem"]

logging:
  to_files: true

  # To enable logging to files, the to_files option has to be set to true
  files:
    # The directory the log files will be written to.
    path: /var/log/filebeat

    # The name of the files that the logs are written to.
    name: filebeat

    # Configure the log file size limit. If the limit is reached, the log
    # file will be rotated automatically.
    rotateeverybytes: 10485760 # = 10MB

  level: debug

I want to create the index %{Application}-YYYY.MM.dd so that each application's logs go into their own index.
For this, I have configured the LS-Indexer as below.

input {
  redis {
    host => "localhost"
    data_type => "list"
    key => "logstash"
  }
}

output {
  elasticsearch {
    hosts => ["xxx.xx.xx.xx:8008"]
    index => "%{Application}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

It shows me this exception in ES:
org.elasticsearch.indices.InvalidIndexNameException: [%{Application}-2016.04.28] Invalid index name [%{Application}-2016.04.28], must be lowercase

Event processing pipeline: FB -> LS-Shipper -> Redis -> LS-Indexer -> nginx -> ES
When I connect FB directly to the Indexer, it works fine.

Please tell me what's wrong with the above configuration.

br,
Sunil

Hi,
I have also tried the code below in the indexer, but it still says the index name must be lowercase.

output {
  elasticsearch {
    hosts => ["xxx.xx.xx.xx:8008"]
    index => "%{[fields][Application]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
Why does it work well when I bypass LS-Shipper and nginx and send logs directly to the LS-Indexer?
Does anything extra happen to the metadata and fields when LS-Shipper sends the data to Redis?

br,
Sunil

Replace the elasticsearch output with a stdout { codec => rubydebug { metadata => true } } output so that you can see exactly what the events look like. Perhaps you are using a plain codec somewhere instead of a json codec.
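For example, you could temporarily swap the indexer's output section for something like this (a debugging sketch; the input section stays as-is):

```
output {
  # Temporary debug output: prints every event to stdout, including the
  # @metadata fields, so you can see whether Application arrives as a
  # real event field or buried inside an unparsed JSON string in [message].
  stdout {
    codec => rubydebug { metadata => true }
  }
}
```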

Hi Magnus,
I will do that change and let you know.
Can you please explain the statement below in more detail?

If the codec settings of inputs and outputs don't match you can get unwanted results. For example, if the output sends the JSON string

{"message": "this is a message", "@timestamp": "2016-04-29T08:10:00.000Z"}

but the receiving input uses a plain string this won't get deserialized into an event with message and @timestamp fields. You'll get an event with a message field that contains '{"message": "this is a message", "@timestamp": "2016-04-29T08:10:00.000Z"}'.

I just saw this output at stdout:

{
    "@timestamp" => "2016-04-29T06:12:41.710Z",
       "message" => "{"message":"Apr 28 13:04:20 hostname.fi SYSLOG 20727 - - INFO;2016-04-28T16:04:20.128+0300;AM-Process;-;Transactions processing has been initialized;-;-;-;-;transactionProcessorFactory.js;-;-","@version":"1","@timestamp":"2016-04-29T06:09:55.990Z","application":"tf","beat":{"hostname":"hostname.lij.fi","name":"hostname.lij.fi"},"count":1,"input_type":"log","offset":604095170,"source":"/var/log/messages","type":"syslogs","host":"hostname.lij.fi","tags":["beats_input_codec_plain_applied"]}",
      "@version" => "1",
  "eventLogTime" => "2016-04-28T13:04:20.128Z"
}

It says "beats_input_codec_plain_applied"
Where does it come from?
br,
Sunil

This confirms my hypothesis. One of your inputs (maybe the redis one?) should use a json codec but currently doesn't.
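A sketch of what that would look like on the indexer's redis input (assuming the rest of your settings stay the same):

```
input {
  redis {
    host => "localhost"
    data_type => "list"
    key => "logstash"
    # Deserialize the JSON strings the shipper pushed into Redis back
    # into structured events, so fields like Application are restored
    # instead of sitting inside the [message] string.
    codec => json
  }
}
```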

Hi,
Thanks.
OK, I will try to find this setting.
If you have any hints, please share.

Is there any chance something is wrong on the shipper side?

Note: I had the old LS 1.5.5. I have now upgraded to LS 2.2.0 using rpm -Uvh LS-2.2x.rpm.

br,
Sunil

Hello,
I tried codec => "json" and json_lines in both the shipper and the indexer, but it didn't resolve my problem. :frowning:

br,
Sunil

Exactly where did you set the codec option? Exactly what do the messages stored in Redis look like?

Hi Magnus,

There are 3 places where I tried it.

In the LS-Shipper:

input {
  beats {
    ....
  }
}

and

output {
  redis {
    ....
  }
}

In the LS-Indexer:

input {
  redis {
    ....
  }
}

Is anything wrong with this?

br,
Sunil

Hi Magnus,

There was a multiline codec on the redis input in the LS-Shipper. When I removed it, things started working.

But now I have no way to use a multiline codec.
The multiline filter is deprecated in LS 2.2.
Multiline error stacktraces are being split across different rows in Kibana. :frowning:

br,
Sunil

Why not use the multiline support in Filebeat?
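That would join the stacktrace lines into one event before they ever reach Logstash. A sketch of the prospector section in filebeat.yml (the pattern here is an assumption that continuation lines start with whitespace, as in Java-style stacktraces; adjust it to your actual log format):

```
filebeat:
  prospectors:
    -
      paths:
        - /var/log/messages
      multiline:
        # Treat lines starting with whitespace as continuations of the
        # previous line and append them to it.
        pattern: '^[[:space:]]'
        match: after
```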