Unable to index data coming from filebeat to elasticsearch

Hi,
The problem I am facing is as follows:
I developed a Logstash config file and indexed events into Elasticsearch, and everything worked as expected. But when I added Filebeat as the input, I can no longer index the events into Elasticsearch because of the additional fields that Filebeat adds.

I am not interested in most of the fields that Filebeat adds.
Note: if I output the events to stdout, everything works as expected, but when I try to send the same output to Elasticsearch I get the error mentioned below.

I would also like to know: is there a way to suppress all the extra data coming from Filebeat?
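One way to do this (a sketch, not a confirmed fix, using field names taken from the event below) is to drop the Filebeat metadata with the `mutate` filter's `remove_field` option in the Logstash pipeline:

```conf
filter {
  mutate {
    # Drop Filebeat-added fields that are not needed downstream.
    # Removing "host" here also sidesteps the mapping error shown below,
    # since it is the field Elasticsearch refuses to parse.
    remove_field => ["input", "prospector", "offset", "tags", "source", "host"]
  }
}
```

Alternatively, Filebeat itself can trim events before they are sent, e.g. with its `drop_fields` processor in `filebeat.yml`.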

{
                "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
           "log_level" => "INFO",
          "time_taken" => 0.247,
          "@timestamp" => 2020-05-13T15:28:55.000Z,
              "source" => "C:\\logs\\data_pipeline.log",
             "message" => "2020-05-13 17:28:55 # data_pipeline# INFO # 0.247 # seconds to update the chunks to oracle DB",
    "application_name" => "data_pipeline",
                 "msg" => "0.247 # seconds to update the chunks to oracle DB",
               "input" => {
        "type" => "log"
    },
                "host" => {
        "name" => "MYHOSTNAME"
    },
            "@version" => "1",
          "prospector" => {
        "type" => "log"
    },
              "offset" => 389198
}
[2020-05-13T17:29:08,722][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2020.05.13", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x9dce0e4>], :response=>{"index"=>{"_index"=>"logstash-2020.05.13", "_type"=>"doc", "_id"=>"iN2nDnIBp3N7K5qGXCBu", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:501"}}}}}
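For context, the `mapper_parsing_exception` indicates the `logstash-2020.05.13` index already maps `host` as `text` (presumably from the earlier events indexed without Filebeat, where `host` was a plain string), while Filebeat sends `host` as an object (`{"name" => "MYHOSTNAME"}`). One common workaround, sketched here under that assumption, is to flatten the object back into a string before the output stage:

```conf
filter {
  mutate {
    # Pull the hostname string out of Filebeat's object-valued "host",
    # then drop the conflicting object. remove_field is a common option
    # and runs after the rename has succeeded.
    rename       => { "[host][name]" => "hostname" }
    remove_field => ["host"]
  }
}
```

This leaves the hostname in a new `hostname` field; another option is to reindex into a fresh index whose mapping expects the object form.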

Is there a question here?
