Hi all. I have an odd problem and I am not sure what is happening. Not sure if this should go into the ES discussions or LS.
I am trying to get some comma separated values into ES via Logstash.
My initial input is a CSV file with 4270 lines in it. It looks like this:
09/22/2017,0,0,2,0,1,8183
09/23/2017,0,0,0,2,3,8179
09/24/2017,0,0,6,0,0,8185
The numbers correspond to a date, some error counts, and counts of users added to and removed from a user database.
I managed to create a pipeline to pump this info into ES, and I can see the documents there in all their glorious searchability. No grok errors or anything.
Now, to get this info updated daily, I want the user database server to send each day's line to Logstash via a tcp input. I created a new pipeline file with the tcp input, restarted Logstash, and was able to use telnet to connect and add new lines.
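For context, the plan on the database-server side is just to open a TCP connection once a day and write the CSV line. Something like this sketch (host, port, and the sample line are placeholders for my setup; the tcp input is line-oriented, so each newline-terminated line becomes one event):

```python
# Sketch of the daily sender on the database server.
# Host/port and the sample line below are placeholders.
import socket

def send_line(line, host="logstash.example.com", port=5002):
    # The Logstash tcp input splits events on newlines, so make sure
    # exactly one trailing newline is sent with the line.
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall((line.rstrip("\n") + "\n").encode("utf-8"))
```

Called once a day with that day's line, e.g. `send_line("09/25/2017,0,0,1,0,2,8184")`.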
However, now the index is growing even though nothing is being sent to it, by about 1 GB per hour. I've run a packet capture on the NIC and I am not seeing any traffic to it.
I just deleted the index, and it has grown back to 750 MB in the last 45 minutes or so. After deleting the index I can't search for any data, so it looks like it is just filling up with white space.
Here are my two pipelines. The first is the initial one I used to load the flat file; this was done locally on the ELK server.
The second is the tcp pipeline.
input {
  file {
    path => "/opt/ldap-file/ldap-results.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "DATE","Error","Err-","Add","Change","Delete","Total" ]
  }
  date {
    match => ["DATE", "MM/dd/yyyy"]
  }
  mutate { convert => ["Add", "integer"] }
  mutate { convert => ["Change", "integer"] }
  mutate { convert => ["Delete", "integer"] }
  mutate { convert => ["Total", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "ldap-users"
  }
  stdout {}
}
&&&&&&&&&&&&&&&&
input {
  tcp {
    port => 5002
  }
}
filter {
  csv {
    separator => ","
    columns => [ "DATE","Error","Err-","Add","Change","Delete","Total" ]
  }
  date {
    match => ["DATE", "MM/dd/yyyy"]
  }
  mutate { convert => ["Add", "integer"] }
  mutate { convert => ["Change", "integer"] }
  mutate { convert => ["Delete", "integer"] }
  mutate { convert => ["Total", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "ldap-users"
  }
}