LogStash sends data directly to ES bypassing Graylog2 server?

Hi All,

I ran a crash test on my Logstash + Redis + Graylog + ES setup and am a
bit confused. In my setup, Logstash agents send data to a Redis DB, from which
a central Logstash instance picks it up and forwards it to ES via Graylog.
First I killed the central Logstash instance, so data began to
accumulate in Redis. Then I brought it back up but killed the Graylog server.
Even then, the data in Redis was drained without any problems! My central
Logstash instance sends data in GELF format to ES, but it does so via the
Graylog server, right? So if the Graylog server was down, how was it able to
reach ES? I ran netstat for port 12201, but it showed nothing listening there.
So how and where was the data emptied? If nothing is listening on 12201, where
did Logstash send the data from Redis? Or am I going wrong somewhere?
Because we tell Logstash in its conf file:

output {
  gelf {
    host => "a.b.c.d"
  }
}

and the port is 12201 by default. So the host here is where the Graylog
server listens on 12201, right? If so, how could the data have been
transferred when the Graylog server was down and nothing was listening on
12201?

I know that ES is the message store for Graylog, but shouldn't the Graylog
server be up for the pipeline to function?
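
In case it helps, the central instance's full pipeline is roughly along
these lines (the hosts and the Redis list key below are placeholders, not my
real values):

input {
  redis {
    host => "redis.example.com"   # Redis DB the agents write to (placeholder)
    data_type => "list"
    key => "logstash"             # assumed list key
  }
}

output {
  gelf {
    host => "a.b.c.d"             # Graylog server; GELF port defaults to 12201
  }
}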


Why do you have a Graylog server in there at all?
Logstash can certainly write directly to ES, and that is probably what you want
so that Kibana can read it.
If you really do want Graylog in the path, have a look at the output
configuration of your Logstash server and make sure that it is actually
going to Graylog.
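
For example, an output block like this sends events straight to ES with no
Graylog involved at all (the host is a placeholder, and the exact option name
may vary with your Logstash version):

output {
  elasticsearch {
    host => "es.example.com"   # write directly into Elasticsearch (placeholder host)
  }
}

whereas only a gelf output like yours should be pointing at the Graylog server
on 12201.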

Cheers,
Edward
