Hi,
I'm trying to set up centralized syslog collection for multiple log sources, so I have a Logstash instance with two separate inputs and two separate outputs. However, for some reason the data from one of the inputs ends up in both indexes. What am I doing wrong?
Both pipelines' configs are below.
Pipeline 1 (remote syslog over TCP):

input {
  tcp {
    port => 5052
    codec => "json_lines"
  }
}
output {
  elasticsearch {
    hosts => "10.50.6.116"
    index => "remote"
  }
  file {
    path => "/var/log/logstash/remote-tcp.log"
  }
  stdout { codec => rubydebug }
}
Pipeline 2 (VM list CSVs):

input {
  file {
    path => "/data/vmlist/*.csv"
    start_position => "beginning"
    sincedb_path => "/tmp/sincedb"
  }
}
filter {
  csv {
    separator => ","
    columns => ["VM Name","Creation Date","Owner","Type","Message"]
  }
}
output {
  elasticsearch {
    hosts => "http://10.50.6.116:9200"
    index => "vms"
    document_type => "csv"
  }
  stdout { codec => rubydebug }
}
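In case it's relevant: I'm not sure how Logstash is actually combining these two files. I suspect they may be getting loaded into one pipeline, so every input feeds every output. Would declaring them as separate pipelines in `pipelines.yml` like the sketch below be the right fix? (The pipeline ids and config paths here are just placeholders, not my real layout.)

```yaml
# pipelines.yml -- hypothetical split into two independent pipelines;
# pipeline.id values and path.config paths are placeholders
- pipeline.id: remote-syslog
  path.config: "/etc/logstash/conf.d/remote-syslog.conf"
- pipeline.id: vm-csv
  path.config: "/etc/logstash/conf.d/vm-csv.conf"
```

Or is the usual approach instead to keep one pipeline and wrap each output in a conditional on a tag or type set at the input?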