Logstash taking too long to run

Hello everyone,

I started indexing approx. 1,000,000 JSON files into Elasticsearch using Logstash.

The Logstash config contains some lowercase and rename operations and a few if/else statements.
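For reference, a filter of that general shape looks like this (a minimal sketch; the field names and the condition are placeholders, not the actual config):

filter {
    if [type] == "user" {
        mutate {
            lowercase => [ "username" ]
            rename => { "old_field" => "new_field" }
        }
    }
}

Filters like these are normally cheap per event, so they are unlikely to be the bottleneck on their own.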

I'm trying to index the data using 16 worker threads and a batch size of 256.
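Assuming those values are set in logstash.yml, the standard setting names would be (values taken from this post):

pipeline.workers: 16
pipeline.batch.size: 256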

The problem is that if I look in Kibana under Index Management, I can't see the index created yet, and Logstash has been running for 3 hours now.

From what I know, Logstash should start indexing documents after finishing each batch. Is my understanding wrong?

Also, I tried with about 30k documents and everything was fine.

I want to know if something is wrong with my config file or my Elasticsearch instance, or if it's just normal for it to take this long.

Thank you!

It is not normal. The index will be created as soon as the first batch arrives via Logstash.
I have many pipelines running, and when I introduce a new pipeline the index gets created right away.

I believe 7.6.1 had a problem where, in some cases, it wasn't creating the index.

It is pretty strange, because with fewer documents it works as expected.

Could it be a problem that I have close_older set to 1, or maybe the fact that sincedb_path is /dev/null?

file {
    codec => json
    path => "user/documents/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    close_older => 1
}

Also, I checked again and the entire stack is 7.6.2

Besides all that, if I check the log file, I don't see any errors or warnings.

I am not using close_older, because Logstash will remove a file once it has already been read.
Don't you need a '/' in front of "user"?

input {
    file {
        path => "/logstash/csv_files/login_stats/_hourly.csv"
        start_position => "beginning"
        mode => "read"
        sincedb_path => "/dev/null"
        codec => plain { charset => "Windows-1252" }
    }
}

Yes, I was using close_older because I initially ran into a problem with having too many files open at the same time.
Do you think that is the problem?
Regarding the path, I was providing a dummy path here; the actual path is correct.
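If too many open files was the reason for close_older => 1, the file input also has a max_open_files option that caps how many files are read concurrently, which may be gentler than closing files after one second. A sketch (same placeholder path as above; the cap value is an example, not a recommendation):

file {
    codec => json
    path => "user/documents/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    max_open_files => 4096
}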

How many files are we talking about?
Check the /etc/logstash/startup.options file.

Any messages in your Logstash log file?

For now, it's about 1 million JSON files.
No error messages in the log file.

OK, then it might be hitting some other limit,
but from a configuration standpoint it looks OK.
Can you try some kind of wildcard match, like ab.json, and test?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.