Difficulty parsing logs from Logstash into ES

Hi all,

We have a new ELK 6.3.0 setup, with the latest 6.3.0 versions of Filebeat and Logstash.

We are getting a lot of errors when logs are sent from Logstash to ES.

I am unsure why this is happening. We have tried loading the Filebeat template into ES via filebeat export template, and also deleting all the templates; in both cases the logs do not load into ES.

It appears that ES is expecting host to be a plain string, while the events carry host as an object containing a name string.

EXAMPLE ERROR:
[2018-06-20T09:59:40,784][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.06.20", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2751b7f7>], :response=>{"index"=>{"_index"=>"filebeat-2018.06.20", "_type"=>"doc", "_id"=>"VIzsHWQBHoHS_AvaRHrY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:303"}}}}}

EXAMPLE INPUT:

 "@timestamp": "2018-06-20T22:14:23.976Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.3.0",
    "pipeline": "filebeat-6.3.0-system-syslog-pipeline"
  },
  "beat": {
    "name": "ip-10-75-127-242",
    "hostname": "ip-10-75-127-242",
    "version": "6.3.0"
  },
  "host": {
    "name": "ip-10-75-127-242"
  },
  "message": "Jun 20 22:11:03 ip-10-75-127-242 kubelet[2155]: I0620 22:11:03.178104    2155 server.go:796] GET /metrics: (3.221285ms) 200 [[python-requests/2.19.1] 127.0.0.1:64936]",
  "source": "/var/log/syslog",
  "offset": 7358444,
  "prospector": {
    "type": "log"
  },
  "input": {
    "type": "log"
  },
  "fileset": {
    "module": "system",
    "name": "syslog"
  }
}
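
If it helps, we can also dump what Logstash is about to send with a temporary debug output. This is a generic sketch, not our actual pipeline config:

  output {
    # Temporary debug output: prints each event to the Logstash console, which
    # makes it easy to see whether "host" leaves Logstash as a plain string or
    # as an object with a "name" key inside it.
    stdout { codec => rubydebug }
  }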

I have the same problem, and it looks like this change is behind it. I'm planning to use a mutate filter as a short-term fix and come up with a better solution later. The impact feels very significant for existing deployments.

Jess, thanks for your reply. This has been wreaking havoc, as some events come in with host as a string and others with host as an object containing name.

Could you please share your mutate filter if you have one? I'm still learning grok.

This is what I used. I don't have a long-term solution yet...

  # Rename some of the Beats fields due to 6.3.0 Beats changes; this goes in
  # the filter section of the Logstash pipeline.
  filter {
    mutate {
      rename => { "[host][name]" => "host" }
    }
  }
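
Since some of your events apparently still come in with host as a plain string, one option is to guard the rename with a conditional so those events are left alone. Just a sketch, not something I have tested against your data:

  filter {
    # Only touch events that carry the new 6.3-style host object; events that
    # already have host as a plain string pass through unchanged.
    if [host][name] {
      mutate {
        rename => { "[host][name]" => "host" }
      }
    }
  }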

Thanks Jess. I am going to use this for another parsing problem. For this issue, it made sense to add the Beats version to the index name so the 6.2 and 6.3 templates stay separate. That fixed the issue for us.

index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
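
For context, that index setting sits in the elasticsearch output like this (the hosts value is a placeholder, not our real endpoint):

  output {
    elasticsearch {
      # Placeholder endpoint; point this at your own cluster.
      hosts => ["http://localhost:9200"]
      # Version-aware index name so 6.2 and 6.3 events land in separate indices.
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }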

That explains why I had the issue everywhere: I'm using the rollover API, so my indices are much larger and mix events from both versions.
