Logstash stops loading text file into ES


(ankur) #1

I have many CSV files, each ~350 MB in size and containing about 1 million lines.
The data in each file is actually JSON, one object per line,
e.g.
{"operatingsystem":"windows7","Arch":"64Bit","Program":"some.exe","IP":"0.0.0.0"}
{"operatingsystem":"windows8","Arch":"32Bit","Program":"some.exe","IP":"0.0.0.0"}
{"operatingsystem":"windows10","Arch":"64Bit","Program":"some.exe","IP":"0.0.0.0"}
{"operatingsystem":"windows7","Arch":"32Bit","Program":"some.exe","IP":"0.0.0.0"}

Logstash stops reading these CSV files after a while (~2-3 hours, ~2-3 GB of data),
although no error is reported by Logstash or Elasticsearch.

Config:

input {
  file {
    path => "D:\somepath\*.csv"
    start_position => "beginning"
  }
}

filter {

  json {
    source => "message"
  }

  geoip {
    ......
  }

  date {
    .....
  }

}

output {
  elasticsearch {
    .......
  }
}
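
For reference, here is a fuller sketch of the input section with an explicit sincedb_path. This is not from the original config; the path and extra options are illustrative assumptions. On Windows, the file input's default sincedb location (derived from $HOME) has been a common source of silently stalled reads, so pinning it to a known writable path is one thing worth trying:

input {
  file {
    path => "D:\somepath\*.csv"
    start_position => "beginning"
    # Illustrative: pin the sincedb to a known writable location on Windows
    sincedb_path => "D:\logstash\sincedb"
  }
}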

Logstash : 1.4, 1.5 (tried both)
Elasticsearch : 1.6
Mapping : Dynamic
Memory : 4 Gb to ES, default to Logstash
Disk : HDD
CPU : i5, 4 cores
OS : Windows 8
Indexing Rate : ~500 Docs/sec


(Mark Walkom) #2

It might be painful, but can you try with verbose or debug logging levels?
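
For Logstash 1.4/1.5 that would look something like the following (command shape from those releases; the config filename is illustrative):

bin\logstash.bat agent -f logstash.conf --verbose

or, for even more detail:

bin\logstash.bat agent -f logstash.conf --debug

Debug output is very noisy at this ingest rate, so redirecting it to a file makes it easier to spot where the file input stops emitting events.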


(system) #3