Logstash: multiple pipelines going into the same index

Hi,

I'm trying to set up a centralized syslog server for multiple log sources.

I have a single Logstash instance with two separate inputs and two separate outputs, but for some reason the data from one of the inputs ends up in both indices.

What am I doing wrong?

Both pipeline configs are below.

input {
  tcp {
    port => 5052
    codec => "json_lines"
  }
}

output {
  elasticsearch {
    hosts => "10.50.6.116"
    index => "remote"
  }
  file {
    path => "/var/log/logstash/remote-tcp.log"
  }
  stdout { codec => rubydebug }
}

input {
  file {
    path => "/data/vmlist/*.csv"
    start_position => "beginning"
    sincedb_path => "/tmp/sincedb"
  }
}

filter {
  csv {
    separator => ","
    columns => ["VM Name","Creation Date","Owner","Type","Message"]
  }
}

output {
  elasticsearch {
    hosts => "http://10.50.6.116:9200"
    index => "vms"
    document_type => "csv"
  }
  stdout { codec => rubydebug }
}

Are you using pipelines.yml? If so, please show it to us. If not, this behaviour is expected. Configuration files are not self-contained: Logstash concatenates all the configuration files in a directory into a single pipeline, reads from all the inputs, puts every event through the filters in order, and then sends every event to all of the outputs.
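To run your two configs as genuinely separate pipelines, give each one its own entry in pipelines.yml. A minimal sketch; the pipeline IDs and .conf file names here are placeholders, so adjust path.config to wherever your files actually live:

# /etc/logstash/pipelines.yml
# Each entry runs as an isolated pipeline with its own inputs, filters, and outputs.
- pipeline.id: remote-syslog
  path.config: "/etc/logstash/conf.d/remote.conf"
- pipeline.id: vm-list
  path.config: "/etc/logstash/conf.d/vms.conf"

With this in place, events from the tcp input only ever reach the outputs in remote.conf, and the CSV events only reach the outputs in vms.conf.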

Reading this thread or this one might help you.
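Alternatively, if you'd rather keep everything in one pipeline, you can tag events at the input and route them with conditionals. A sketch of that approach, with the tag names being my own invention (filters can be wrapped in the same conditionals):

input {
  tcp {
    port => 5052
    codec => "json_lines"
    tags => ["remote"]
  }
  file {
    path => "/data/vmlist/*.csv"
    start_position => "beginning"
    sincedb_path => "/tmp/sincedb"
    tags => ["vms"]
  }
}

output {
  # Route each event to its own index based on the tag set at the input.
  if "remote" in [tags] {
    elasticsearch {
      hosts => "10.50.6.116"
      index => "remote"
    }
  } else if "vms" in [tags] {
    elasticsearch {
      hosts => "http://10.50.6.116:9200"
      index => "vms"
    }
  }
}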

Thanks, that was the issue.

I followed a blog guide that didn't mention the pipelines.yml file.
