I'm getting log files from Filebeat and applying a grok pattern to every line of the log (tested on http://grokconstructor.appspot.com and working well for the grok part). But it seems that when the grok is done and the event should be sent to Elasticsearch, it can't be indexed.
Here is my error (I don't understand the full error):
[2017-11-28T07:58:37,139][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-wowza-2017.11.28", :_type=>"logs", :_routing=>nil}, 2017-11-28T12:57:51.714Z %{host} 2017-04-08 00:25:26 CEST connect session INFO 200 10.196.134.41 - defaultVHost event1 definst 0.001 [any] 1935 rtmp://10.196.134.103:1935/event1 10.196.134.41 rtmp - LNX 9,0,124,2 50068641 3291 3073 - - - - - - - - - - - - - rtmp://10.196.134.103:1935/event1 -], :response=>{"index"=>{"_index"=>"logstash-wowza-2017.11.28", "_type"=>"logs", "_id"=>"AWACtSXoIpEoNGxsvrbO", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [program] of different type, current_type [text], merged_type [date]"}}}}
I have a regex in my grok pattern to capture a date: 2017-11-03 16:48:20 (there is a tab between the two values).
After that I do a gsub to replace the tab with a space, and then I try to match the result with a date filter:
mutate {
  gsub => [ "timestamp", "[\t]", " " ]
}
date {
  match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
}
As for the program field, I don't need it (I don't even know why there is one, because I didn't put one in my grok pattern), but my problem is still the same.
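Judging from the error message, the 400 is a mapping conflict: the logstash-wowza-2017.11.28 index already maps program as text, and the new event tries to merge it as date, so Elasticsearch rejects it. One hedged way around it, assuming the field is literally named program and you really don't need it, is to drop it before output:

```
mutate {
  # drop the conflicting field so the event no longer
  # collides with the existing [program] mapping
  remove_field => [ "program" ]
}
```

If the field should be kept, the alternative is deleting or reindexing that day's index so the mapping can be created fresh with the intended type.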
The @timestamp value isn't matched with my custom timestamp field, because my custom field is mapped as a "string" type and I need to change it to a "date" type.
Is there a way to force the type of my custom field?
Why are you saving both @timestamp and timestamp? If you want timestamp to be a date (matching @timestamp), just use the date filter to parse it, but set the target option to store the result in timestamp.
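The suggestion above would look something like this (a sketch based on the field name used earlier in the thread):

```
date {
  match  => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
  # write the parsed date back into the timestamp field
  # instead of the filter's default target, @timestamp
  target => "timestamp"
}
```

With target set, timestamp gets indexed as a proper date while @timestamp keeps the ingestion time.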
I just know that @timestamp is the date when the log was ingested by Logstash, and my custom timestamp is the date of each line of my logs (and they are not the same).
To do visualizations in Kibana I need to use @timestamp for the date histogram (because my custom timestamp is mapped as a string type, and the histogram needs a date type to work).
That's why I'm trying to make my custom timestamp replace the value in @timestamp.
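For that goal, a minimal sketch (assuming grok already puts the raw value into a timestamp field) is to rely on the date filter's default behavior: when no target is given, the parsed value overwrites @timestamp, and the now-redundant string field can be dropped:

```
date {
  # no target option, so the parsed value replaces @timestamp
  match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
}
mutate {
  # the per-line date now lives in @timestamp, so the
  # string copy is no longer needed
  remove_field => [ "timestamp" ]
}
```

Kibana's date histogram on @timestamp would then reflect the time each log line was written rather than when it was ingested.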