Parsing error in Logstash logs


#1

Logstash gives the following error in its logs:

[2018-06-06T10:55:32,611][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"abcd-2018.06.06", :_type=>"doc", :_routing=>nil}, #&lt;LogStash::Event:0x3700c109&gt;], :response=>{"index"=>{"_index"=>"abcd-2018.06.06", "_type"=>"doc", "_id"=>"XXSO02MBfASZiwKaqhNz", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [end_time]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"07:00:49\""}}}}}

The logs I am trying to parse are:

Type 1:
2018-06-05 21:00:53 INFO Start:785 - Start Time in hh:mm:ss: 09:00:12 Status:success Quote No:1234 Endtime: 09:00:53 Total time in seconds:41

Type 2:
2018-06-05 23:00:30 ERROR Start:491 - Start Time in hh:mm:ss: 11:00:11 Status:Failure Quote No:null Exception:Element is not clickable at point (224.5,202) because another element obscures it
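For reference, the fields the grok captures pull out of a type-1 line can be sketched with an equivalent Python regex (illustrative only; the group names mirror the grok field names, not grok's actual compiled pattern):

```python
import re

# Rough Python equivalent of the grok captures for a "type 1" line.
TYPE1 = re.compile(
    r"(?P<time_stamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (?P<info>\w+)\s+"
    r"Start:(?P<snum>\d+) - Start Time in hh:mm:ss: (?P<start_time>\d{2}:\d{2}:\d{2}) "
    r"Status:(?P<status>\w+)\s+Quote No:(?P<q_no>\w+) "
    r"Endtime: (?P<end_time>\d{2}:\d{2}:\d{2}) Total time in seconds:(?P<total_time>\d+)"
)

line = ("2018-06-05 21:00:53 INFO Start:785 - Start Time in hh:mm:ss: 09:00:12 "
        "Status:success Quote No:1234 Endtime: 09:00:53 Total time in seconds:41")
m = TYPE1.match(line)
print(m.group("end_time"))  # the field the mapping error complains about -> "09:00:53"
```

Note that `end_time` is captured as a plain time-of-day string like `09:00:53`, which is what ends up being sent to Elasticsearch.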

My .conf file looks like:

input {
  beats {
    port => "xyz"
  }
}

filter {
  if "INFO" in [message] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:time_stamp} %{WORD:info}  Start:%{NUMBER:snum} - Start Time in hh:mm:ss: %{TIME:start_time} Status:%{WORD:status}  Quote No:%{NUMBER:q_no} Endtime: %{TIME:end_time} Total time in seconds:%{NUMBER:total_time}" }
      remove_field => ["message"]
    }
    date {
      match => [ "time_stamp", "YYYY-MM-dd HH:mm:ss Z" ]
    }
  }
  else if "ERROR" in [message] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:time_stamp} %{WORD:info} Start:%{NUMBER:snum} - Start Time in hh:mm:ss: %{TIME:start_time} Status:%{WORD:status}  Quote No:%{WORD:q_no}  Exception:%{GREEDYDATA:exception}" }
      remove_field => ["message"]
    }
    date {
      match => [ "time_stamp", "YYYY-MM-dd HH:mm:ss Z" ]
    }
  }
}

output {
  elasticsearch {
    hosts => [ "elkmonp3.newindia.co.in:9200" ]
    index => "abcd-%{+YYYY.MM.dd}"
    user => "abc"
    password => "abc1234"
  }
}
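One detail worth double-checking in the date filter: the pattern ends in Z (a timezone offset token), while the logged timestamps such as 2018-06-05 21:00:53 carry no offset. The analogous parse in Python (strptime syntax rather than Joda, but the same idea):

```python
from datetime import datetime

ts = "2018-06-05 21:00:53"

# Without an offset token the timestamp parses cleanly...
parsed = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
print(parsed.year)  # 2018

# ...but a format that demands an offset (the analogue of the trailing
# "Z" in the Joda pattern) fails, because the log line contains none.
try:
    datetime.strptime(ts, "%Y-%m-%d %H:%M:%S %z")
    ok = True
except ValueError:
    ok = False
print(ok)  # False
```

This is a side observation about the config, not the cause of the mapping error, but it can produce _dateparsefailure tags worth watching for.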

Logstash version: 6.2.2


(Magnus Bäck) #2

What's the mapping of the end_time field? Use ES's get mapping API.
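For example, GET abcd-2018.06.06/_mapping returns JSON you can drill into for the field's type. A sketch of inspecting such a response in Python (the sample dict stands in for the real API response; the single "doc" type and the "date" value are hypothetical, matching the 6.x response shape):

```python
# Stand-in for the JSON body returned by GET abcd-2018.06.06/_mapping.
mapping_response = {
    "abcd-2018.06.06": {
        "mappings": {
            "doc": {
                "properties": {
                    "end_time": {"type": "date"},  # hypothetical value for illustration
                }
            }
        }
    }
}

index = mapping_response["abcd-2018.06.06"]
end_time_type = index["mappings"]["doc"]["properties"]["end_time"]["type"]
print(end_time_type)
```

If the reported type is anything other than text/keyword, a plain "07:00:49" string would be rejected exactly as in the error above.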


#3

Sorry for the late reply. This setup has somehow been running smoothly now, with no errors for the last two days, and the index is being generated. The mapping looks like this:
{"mappings":{"doc":{"dynamic_templates":[{"message_field":{"path_match":"message","match_mapping_type":"string","mapping":{"norms":false,"type":"text"}}},{"string_fields":{"match":"*","match_mapping_type":"string","mapping":{"fields":{"keyword":{"ignore_above":256,"type":"keyword"}},"norms":false,"type":"text"}}}],
"properties":{"@time_stamp":{"type":"date"},"@timestamp":{"type":"date"},"@version":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"beat":{"properties":{"hostname":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"name":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"version":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}}}},
"end_time":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"exception":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"host":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"houre":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"info":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"minutee":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"offset":{"type":"long"},"prospector":{"properties":{"type":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}}}},
"q_no":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"seconde":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"snum":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"source":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"start_time":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"status":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"tags":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"time_stamp":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}},"total_time":{"type":"text","norms":false,"fields":{"keyword":{"type":"keyword","ignore_above":256}}}}},
"_default_":{"dynamic_templates":[{"message_field":{"path_match":"message","match_mapping_type":"string","mapping":{"norms":false,"type":"text"}}},{"string_fields":{"match":"*","match_mapping_type":"string","mapping":{"fields":{"keyword":{"ignore_above":256,"type":"keyword"}},"norms":false,"type":"text"}}}],"properties":{"@time_stamp":{"type":"date"}}}}}}

But I would like to know why that error occurred earlier and not now, with the same configuration. While the error was occurring, ES was unable to create the index.
Thank you for your help.


(Magnus Bäck) #4

But I would like to know why that error occurred earlier and not now, with the same configuration.

For some reason, at some point the end_time field contained a value that ES's automapper interpreted as something other than a string. Perhaps the shape of the end_time values changed, making new values incompatible with the original field mapping?
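A toy sketch of that dynamic-mapping behaviour (this is not ES's actual detection code, just an illustration of the failure mode): the first value seen for a field decides its type, and every later value has to fit it.

```python
from datetime import datetime

def looks_like_date(value: str) -> bool:
    """Crude stand-in for ES date detection: accept full date-times only."""
    try:
        datetime.strptime(value, "%Y-%m-%dT%H:%M:%S")
        return True
    except ValueError:
        return False

field_types = {}  # the index "mapping": field name -> inferred type

def index_doc(doc):
    for field, value in doc.items():
        inferred = "date" if looks_like_date(value) else "text"
        locked = field_types.setdefault(field, inferred)  # first value wins
        if locked == "date" and inferred != "date":
            # Analogue of the mapper_parsing_exception in the error above.
            raise ValueError(f"failed to parse [{field}]: {value!r}")

# If some early event carried a full date-time, end_time gets mapped as date...
index_doc({"end_time": "2018-06-06T07:00:49"})

# ...and a later bare time-of-day no longer fits that mapping.
try:
    index_doc({"end_time": "07:00:49"})
    failed = False
except ValueError:
    failed = True
print(failed)  # True
```

Once the daily index rolled over, the first event of the new index would set the mapping afresh, which is one plausible reason the same configuration later ran cleanly.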


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.