Logstash not stashing a huge file

I have a file of ~23 GB. I am using Filebeat as the Logstash input, but Logstash only parses a few lines from the start of the log file. What should I do?

1 - Say hello
2 - Explain your problem a bit more ..
3 - Show your config files ..

It's a good start. :slight_smile:


Hello, I am Sana from NUST, a newbie to the ELK Stack.

I want to parse a ~23 GB file. For a few lines of input my config worked fine, but with the full 23 GB file it got stuck partway through.

What checks should I run so Logstash can handle such a huge file?

What does your config look like? What does the data in the file look like?

log lines: 90-50-31: ill ill you are You
log lines: 90-50-31: ill till you are You
log lines: 90-50-31: ill end you are You
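Lines of that shape (a dash-delimited prefix, a colon, then free text) can be split without regexes using Logstash's dissect filter. A minimal sketch, assuming every line follows that pattern; the field names prefix and rest are placeholders, not anything from this thread:

filter {
  dissect {
    # Split each line at the first ": " into two hypothetical fields
    mapping => { "message" => "%{prefix}: %{rest}" }
  }
}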

input {
  beats {
    port => 5044
  }
}

filter {
  # The conditional and mutate filters must live inside a filter block,
  # and each side of an "or" needs its own comparison.
  if [ty] =~ "YOU" or [ty] =~ "you" {
    mutate {
      # Strip alphabetic runs from the field named "field"
      gsub => [ "field", "[a-zA-Z]+", "" ]
    }
  } else {
    mutate {
      # Otherwise strip numeric runs
      gsub => [ "field", "[0-9]+", "" ]
    }
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
  }
  stdout {
    codec => rubydebug
  }
}
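Before replaying 23 GB through that pipeline, the syntax can be validated up front. A quick check, assuming the config above is saved as pipeline.conf (a hypothetical filename) and the command is run from the Logstash home directory:

bin/logstash -f pipeline.conf --config.test_and_exit

If the config parses, Logstash prints a confirmation and exits instead of starting the pipeline.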

Do you have 23 GB of log lines like that? What does your Beats config look like?

Yes, the logs almost all have the same format.

filebeat.yml:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /home/matrix/mylogs.log

output.logstash:
  hosts: ["localhost:5044"]
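For a file this large, a few log-input settings affect how Filebeat reads it. A sketch of the relevant knobs, with illustrative values rather than recommendations:

filebeat.prospectors:
- type: log
  paths:
    - /home/matrix/mylogs.log
  # Read buffer per fetch, in bytes (default 16384)
  harvester_buffer_size: 65536
  # Keep the harvester open this long without new data (default 5m)
  close_inactive: 10m

Filebeat tracks its read offset in its registry file, so a closed harvester resumes where it left off once the file changes again.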

This question can be rephrased as: how do you deal with huge log files using the ELK stack?
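One general answer: Logstash throughput on big files is usually governed by the pipeline settings in logstash.yml rather than by file size itself. A hedged sketch of the settings worth checking; the values are illustrative, and defaults vary by version and CPU count:

# logstash.yml
# Worker threads for the filter + output stages (defaults to CPU cores)
pipeline.workers: 4
# Events per batch handed to each worker (default 125)
pipeline.batch.size: 250

Beyond that, the stdout { codec => rubydebug } output in the config above prints every event and will slow a 23 GB replay considerably; dropping it for the full run is an easy win.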
