ELK error

Hi! I added a new log file to Logstash (the following lines to logstash.conf):

Input section

input {
  ...

  # VirusTotal
  file {
    path => ["/data/vt/log.json"]
    codec => json
    type => "VirusTotal"
  }
}
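
For context, each line in /data/vt/log.json is one JSON object. A sketch of how a line might look (every field except "timestamp", which the date filter below matches as ISO8601, is made up for illustration):

# Append a hypothetical sample event to the watched file; only "timestamp"
# is assumed by the config, the other field is a placeholder.
echo '{"timestamp":"2017-02-21T10:15:30.000Z","scan":"example"}' >> /data/vt/log.json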

Filter section

filter {
  ...

  # VirusTotal
  if [type] == "VirusTotal" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  }

  ...

  if [type] == ... or [type] == "VirusTotal" {
    mutate {
      add_field => {
        "ip_ext"   => "${MY_EXTIP}"
        "ip_int"   => "${MY_INTIP}"
        "hostname" => "${MY_HOSTNAME}"
      }
    }
  }
}
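
The ${...} references are resolved from the environment Logstash is started in, so the variables have to be exported first. A minimal sketch (the values are placeholders, not my real ones):

# Set before starting Logstash; ${MY_EXTIP} etc. in the config above are
# substituted from these environment variables. Placeholder values only.
export MY_EXTIP="203.0.113.10"
export MY_INTIP="10.0.0.5"
export MY_HOSTNAME="vt-host"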

Output section

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
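
To double-check that events actually reach Elasticsearch, a count query like this can be used (assuming the default logstash-* index naming and the host from the output above):

# Count indexed VirusTotal events; the index pattern assumes the default
# logstash-%{+YYYY.MM.dd} naming of the elasticsearch output.
curl -s 'http://elasticsearch:9200/logstash-*/_count?q=type:VirusTotal'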

A test of logstash.conf says everything is OK; I ran something like the command below.
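
(A sketch of the check; the path to the Logstash binary depends on the install.)

# Validate the pipeline config and exit; this flag is available in Logstash 5.x.
bin/logstash -f logstash.conf --config.test_and_exit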
But when I go to Kibana's Index Patterns -> Refresh field list, the fields from my new log file appear in the list, but Kibana shows an error:

[screenshot of the error]

I don't know how to solve this problem. Thank you in advance for your help!

Hi there, this sounds related to "Kibana 5.1.2/ES 5.1.2 - Index patterns with more than 1000 fields". From that post:

The 413 (PAYLOAD TOO LARGE) response is actually most likely coming from Elasticsearch. Kibana is just surfacing the error it gets back.

It happens as part of a request Kibana is making to it, so it would be useful to see what that request looks like. If you open your browser's debugger, you can inspect the request in the network tab. That might indicate why that request is so large... most likely it's related to the field count, but I couldn't tell you offhand why the field count would make a request too large.

Could you check the network requests that get logged when you refresh the field list and tell me what you see there?

And could you tell me which version of the Elastic Stack you're using?

Thanks,
CJ

I'm sorry! It really shows the error "413 Request Entity Too Large". The problem is solved and it wasn't related at all to ELK! Thank you very much for your help!

Ah, glad you were able to figure it out!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.