My first question on this forum, so apologies if I've missed posting any detail. Let me also take this opportunity to say a big hello to everyone from a newbie to the world of the Elastic Stack!
I have a CSV file I am trying to send to Logstash, sourced both from a local file input and via Filebeat (all v7.7), in exactly the same format. The pipeline works for the local file input, but the same file coming in via Filebeat throws this error:
error => "mapper_parsing_exception"
reason => "failed to parse field [host] of type [text] in document with id 'blah blah'. Preview of field's value: '{name=my.host.name}'"
caused_by => "illegal_state_exception"
reason => "Can't get text on a START_OBJECT at 1:974"
My Logstash conf (conf.d/my-csv-data.conf):
input {
  file {
    start_position => "beginning"
    path => "/what/ever/*.csv"
  }
  beats {
    id => "some-id"
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "myindex"
  }
}
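In case it helps, I can compare what the events from the two inputs actually look like by temporarily adding a stdout output next to elasticsearch (just a debugging sketch; rubydebug is the standard pretty-print codec):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "myindex"
  }
  # print each event to the Logstash log so the two input paths can be compared
  stdout { codec => rubydebug }
}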
The Filebeat config is essentially a couple of paths added under the default filebeat.inputs, collecting *.csv and pointing at the Logstash host above on port 5044. But here it is all the same:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /path/to/data/*.csv
    tags: ["csv-type-x"]
  - type: log
    enabled: true
    paths:
      - /other/path/to/data/*.csv
    tags: ["csv-type-y"]
  # etc etc

output.logstash:
  hosts: ["logstashhost:5044"]
  # and the rest of the default stuff here
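For what it's worth, the config itself can be sanity-checked with Filebeat's built-in test commands (standard Filebeat 7.x CLI; the config path is just an example):

filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml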
The CSV fields do not appear to be the problem when the file is processed locally, and the hostname is populated properly too. The preview value in the error ('{name=my.host.name}') makes me suspect the host field arrives as an object via Filebeat, while the local file input sets it as a plain string. What could be different about the same data coming in through Filebeat?
Any hints would be really appreciated!
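For what it's worth, one workaround I'm considering (an untested sketch; the hostname target field is just my own choice) is flattening the Beats host object in a filter before the output:

filter {
  # Beats events carry host as an object (e.g. { "name" => "my.host.name" }),
  # while the local file input sets it as a plain string
  if [host][name] {
    mutate {
      # rename is applied before remove_field within a single mutate
      rename => { "[host][name]" => "hostname" }
      remove_field => ["host"]
    }
  }
}

Is that a sensible direction, or is there a cleaner way to keep the two inputs compatible?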