Data mixed between indices

I have a pretty simple setup: ES 2.2 and Logstash 2.2.0.
I have several conf files for Logstash; each one defines a port to listen on, labels data with "type => xyz", and writes to an index with the same name plus a date reference: 'index => "xyz-%{+YYYY.MM.dd}"'.
The indices are created with no problems. The real problem is that I find data with type=xyz in an index named "abc" that was meant for data of type abc arriving on a specific port.
So it was meant to be:
port 1234 --> type=abc --> index=abc-yyyy.mm.dd
port 1235 --> type=xyz --> index=xyz-yyyy.mm.dd
I search the data using Kibana, where the index patterns are defined as abc-* and xyz-* (it is the same with the old yyyy.mm.dd syntax). When I expand the matching records I see _index:abc together with type:xyz, which should be impossible as far as I understand.
When I need to find some data I have to try each index in Kibana, and I find it in every index...
What am I doing wrong?
Some of the files are as simple as five-line collectd configs...

Here is data from the syslog collector (listening on UDP 514) that ended up in the collectd index (test-cd), whose input listens on 25826.
Thanks

What does your Logstash config look like?

This is the syslog one

input {
  syslog {
    type => "syslog"
  }
}


output {
  elasticsearch {
    hosts => ["localhost"]
    index => "syslog-%{+YYYY.MM.dd}"
  }

  # stdout { codec => rubydebug }
}

This is the collectd one

input {
  udp {
    port => 25826
    buffer_size => 1452
    codec => collectd {}
    type => "collectd"
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
    action => "index"
    index => "test-cd"
  }
  # stdout { codec => rubydebug }
}


If you have multiple outputs and do not use conditionals in your configuration, Logstash will send each event to all outputs.
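To illustrate what that means for the two files in this thread, here is a conceptual sketch (not a literal file) of the pipeline Logstash effectively runs after concatenating everything in conf.d:

# What Logstash effectively sees after reading conf.d/*.conf:
input {
  syslog { type => "syslog" }        # from the syslog file
  udp {
    port => 25826
    codec => collectd {}
    type => "collectd"               # from the collectd file
  }
}

output {
  # Without conditionals, EVERY event goes to BOTH outputs:
  elasticsearch { hosts => ["localhost"] index => "syslog-%{+YYYY.MM.dd}" }
  elasticsearch { hosts => ["localhost"] index => "test-cd" }
}

So a collectd event arriving on port 25826 is indexed into both test-cd and syslog-*, which matches the mixed data described above.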

Maybe I'm missing something, but the doc you refer to explains conditionals as "if/then/else" constructs.
There are no conditionals in my config files, since each one has a single output.
The formatting above was not clear, but these are two SEPARATE files, one for syslog and one for collectd, each loaded from the conf.d directory by the daemon.
The only double output I see is the (commented-out) console rubydebug line.

OK, maybe I'm starting to get it.
Are you saying that if I put 10 .conf files in the conf.d directory they are treated as one big config? In that case I need to put a conditional in front of each output, like
if type1 then output 1
if type2 then output 2
I thought that each file lived in its own "domain". I have seen several examples of multiple files run by the daemon, and no one mentioned cross inputs.

The syslog file above should be written as

output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => ["localhost"]
      index => "syslog-%{+YYYY.MM.dd}"
    }
  }
}

So that if it is run as a single file it behaves normally, and when run alongside other files it will output only "its own" events. Is this right?

Yes, that is correct. Logstash will read all config files in the directory and basically concatenate them, which is why conditionals in output and filter blocks are essential when modularising configuration.
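For completeness, the collectd file from earlier in this thread would get the same treatment. A sketch following the same pattern, using the type set in its udp input:

output {
  if [type] == "collectd" {
    elasticsearch {
      hosts => ["localhost"]
      action => "index"
      index => "test-cd"
    }
  }
}

With a conditional like this in every output block, each file keeps its events to itself no matter how many files are loaded from conf.d.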


Thank you, I totally misunderstood the meaning of having different files. I'm rewriting my conditionals right now.

And of course it is working as expected now. Thank you.