Hi,
I am using FB to send logs to LS-Shipper.
Event processing pipeline is FB-LS-Shipper->Redis->LS-Indexer->nGinx-->ES
Below is FB configuration.
Note: I have created a custom field named Application, with an uppercase 'A'.
filebeat:
  # List of prospectors to fetch data.
  prospectors:
    -
      paths:
        - /var/log/messages
      input_type: log
      fields:
        Application: tf
      fields_under_root: true
      document_type: syslogs
  registry_file: /var/lib/filebeat/registry

output:
  ### Logstash as output
  logstash:
    # The Logstash hosts
    hosts: ["shipper:5000"]
    # Optional TLS. By default it is off.
    #tls:
      # List of root certificates for HTTPS server verification
      #certificate_authorities: ["/opt/logstash-forwarder/SSL/abc-issuing.cer.pem"]

logging:
  to_files: true
  # To enable logging to files, to_files has to be set to true.
  files:
    # The directory where the log files will be written to.
    path: /var/log/filebeat
    # The name of the files where the logs are written to.
    name: filebeat
    # Configure the log file size limit. If the limit is reached, the log file
    # will be rotated automatically.
    rotateeverybytes: 10485760 # = 10MB
  level: debug
I want to create the index %{Application}-YYYY-MM-DD so that each application's logs go into its own index.
For this, I have configured the LS-Indexer as below, but ES shows me this exception:
org.elasticsearch.indices.InvalidIndexNameException: [%{Application}-2016.04.28] Invalid index name [%{Application}-2016.04.28], must be lowercase
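An indexer output along these lines would produce exactly that error whenever the Application field is missing from the event: the %{Application} sprintf reference is then left as a literal string, and its uppercase 'A' violates ES's lowercase-only index name rule (this is a sketch of the likely setup, not the actual posted config; the host is a placeholder):

```
output {
  elasticsearch {
    hosts => ["nginx-host:9200"]               # placeholder for the nginx proxy in front of ES
    index => "%{Application}-%{+YYYY.MM.dd}"   # resolves only if the event has an Application field
  }
}
```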
When I connect FB directly to the Indexer, it works fine.
Please tell me what's wrong with the above configuration.
Why does it work well when I bypass LS-Shipper and nginx and send logs directly to the LS-Indexer?
Does anything extra happen to the metadata and fields when LS-Shipper sends the data to Redis?
Replace the elasticsearch output with a stdout { codec => rubydebug { metadata => true } } output so that you can see exactly what the events look like. Perhaps you are using a plain codec somewhere instead of a json codec.
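In the indexer's config that could look like this (temporarily replacing the elasticsearch output; the filters stay as they are):

```
output {
  # Dump every event, including @metadata, in a readable form
  stdout { codec => rubydebug { metadata => true } }
}
```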
If the codec settings of inputs and outputs don't match, you can get unwanted results. For example, if the output sends the JSON string
{"message": "this is a message", "@timestamp": "2016-04-29T08:10:00.000Z"}
but the receiving input uses a plain codec, it won't be deserialized into an event with message and @timestamp fields. Instead you'll get an event whose message field contains the literal string '{"message": "this is a message", "@timestamp": "2016-04-29T08:10:00.000Z"}'.
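Concretely, both sides of the Redis hop should use the json codec (which is the default for the redis plugins); the host and key names below are placeholders:

```
# LS-Shipper: serialize the whole event as JSON into Redis
output {
  redis {
    host      => "redis-host"
    data_type => "list"
    key       => "filebeat-events"
    codec     => json
  }
}

# LS-Indexer: deserialize it back into an event with all its fields
input {
  redis {
    host      => "redis-host"
    data_type => "list"
    key       => "filebeat-events"
    codec     => json
  }
}
```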
There was a multiline codec on the redis input in LS-Shipper. When I removed it, it started working.
But now I have no way to use a multiline codec.
The multiline filter is deprecated in LS 2.2.
Multiline error stack traces are being split across separate rows in Kibana.
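Since Filebeat 1.1 the multiline joining can be done in Filebeat itself, before the events ever reach the shipper, which sidesteps the codec problem on the Redis hop entirely. A sketch for stack-trace-style logs (the pattern is an example; adjust it to your log format):

```
filebeat:
  prospectors:
    -
      paths:
        - /var/log/messages
      multiline:
        # Lines starting with whitespace are continuations of the previous line
        pattern: '^[[:space:]]'
        negate: false
        match: after
```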