Hi, I have some plans to use Logstash + Elasticsearch + Kibana at my
company, and I have some questions.
First of all, if I want to use a central Elasticsearch cluster for several
applications' logs, is there a way to define a schema per application? I
mean, if one app indexes a field as a string and another app indexes the
same field as a number, that could be a problem, right? Is there a way to
handle it?
Second, we are planning to use one Logstash and several Logstash
forwarders, but if the applications need different filters, is it a problem
to have one central Logstash handle all the logs and send them to
Elasticsearch? Should we use one Logstash per application?
If you are using LS to parse things then you probably want to define your
value types - i.e. string, int - in a grok filter, then output each
application into a different index. This will keep them separate. You can
then also add a mapping to further enhance things.
In your case it makes sense to just have a forwarder send to a central LS
instance where the processing is done.
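A rough sketch of what that central LS config could look like (the port,
certificate paths, grok patterns, field names and index pattern below are
placeholders, not a tested setup, and option names vary a bit between
Logstash versions):

input {
  lumberjack {
    # logstash-forwarder ships events over the lumberjack protocol
    port            => 5000
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key         => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  # "type" is set per log file in each forwarder's config (hypothetical values)
  if [type] == "app_a" {
    grok {
      # :int coerces the capture, so "duration" is indexed as a number
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{NUMBER:duration:int} %{GREEDYDATA:msg}" ]
    }
  } else if [type] == "app_b" {
    grok {
      match => [ "message", "%{LOGLEVEL:level} %{GREEDYDATA:msg}" ]
    }
  }
}

output {
  # one index per application keeps conflicting mappings apart
  elasticsearch {
    host  => "localhost"                    # newer versions use hosts => [...]
    index => "logs-%{type}-%{+YYYY.MM.dd}"
  }
}

With per-application index patterns like logs-app_a-* and logs-app_b-*, you
can then also put an index template on each pattern to pin the mappings
explicitly.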
Actually, my whole log file is in JSON format and I don't use any grok; I
just log to the file, Logstash picks up the JSON and sends it to
Elasticsearch, indexing everything. In this case, what do you think I could
do?
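A hedged sketch of what that all-JSON pipeline could look like while still
keeping the applications separate (field names, the "duration" example and
the index pattern are made up; it reuses the per-application-index idea from
the reply above): skip grok entirely, parse the shipped line with the json
filter on the central LS, and coerce any field whose type differs between
apps with mutate/convert.

input {
  lumberjack {
    port            => 5000
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key         => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  # no grok needed: each shipped line is already JSON
  json {
    source => "message"
  }
  # if one app logs "duration" as a string and another as a number,
  # coerce it on one side so the types agree...
  if [type] == "app_b" {
    mutate {
      convert => [ "duration", "integer" ]
    }
  }
}

output {
  # ...and/or keep each application in its own index ("type" comes from
  # the forwarder config), so conflicting mappings never meet
  elasticsearch {
    host  => "localhost"                    # newer versions use hosts => [...]
    index => "logs-%{type}-%{+YYYY.MM.dd}"
  }
}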