Problem with logstash passing files to ES

Hi, I'm having an issue with Logstash.

I'm receiving log files from Filebeat and applying a grok pattern to every line of the log (tested, and the grok part works well). But it seems that when the grok is
done and the event should be sent to Elasticsearch, it fails to be indexed.

Here is my error (I don't understand the full error):

[2017-11-28T07:58:37,139][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-wowza-2017.11.28", :_type=>"logs", :_routing=>nil}, 2017-11-28T12:57:51.714Z %{host} 2017-04-08 00:25:26 CEST connect session INFO 200 - defaultVHost event1 definst 0.001 [any] 1935 rtmp:// rtmp - LNX 9,0,124,2 50068641 3291 3073 - - - - - - - - - - - - - rtmp:// -], :response=>{"index"=>{"_index"=>"logstash-wowza-2017.11.28", "_type"=>"logs", "_id"=>"AWACtSXoIpEoNGxsvrbO", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [program] of different type, current_type [text], merged_type [date]"}}}}

Does anyone have a clue?

Thanks in advance.

The program field has been mapped as text and now you're trying to shove date-like data into it (that's what "current_type [text], merged_type [date]" means). What type should it be?

It should be a date; I need a date field to parse with this:

date {
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
}

Is it possible to get a "date" type by extracting the value with a regex?
=> Will the type be considered a date or a string?

What does the timestamp field have to do with the program field?

If program indeed should contain a date make sure you don't send non-date values its way.

I have a regex in my grok pattern to capture a date:
2017-11-03 16:48:20 (it has a tab between the two values)
After that I do a gsub to replace the tab with a space,
and then I try to match the result with a date filter:

  mutate {
      gsub => [ "timestamp", "[\t]", " " ]
  }
  date {
      match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
  }

And as for the program field, I don't need it (I don't even know why there is one, because I didn't put one in my grok pattern).

If you don't want the field then just remove it.
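A minimal sketch with a mutate filter, assuming the field is literally named program as the error message suggests:

```
filter {
  mutate {
    # Drop the unwanted field so it never reaches Elasticsearch
    remove_field => [ "program" ]
  }
}
```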


But my problem is still the same:
the @timestamp value isn't matched with my custom timestamp field, because my custom field is defined as a "string" type, and I need to change it to a "date" type.

Is there a way to force the type of my custom field?

Okay, but that's a completely different problem.

Why are you saving both @timestamp and timestamp? If you want timestamp to be a date (that matches @timestamp) just use the date filter to parse it but set the target option to store the result in timestamp.
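As a sketch (field name and date format assumed from the snippets above), the target option looks like this:

```
filter {
  date {
    match  => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
    # Write the parsed date back into the timestamp field
    # instead of the default target, @timestamp.
    target => "timestamp"
  }
}
```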

I don't know if I need both.

I just know that @timestamp is the date when the log was ingested into Logstash, and my custom timestamp is the date of each line of my logs (and they are not the same).
To do visualizations with Kibana I need to use @timestamp for the date histogram (because the custom timestamp is defined as a string type, and a date histogram needs a date type to work).

That's why I'm trying to make my custom timestamp replace the value in @timestamp.
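If that's the goal, the date filter already does it: with no target option set, the parsed result overwrites @timestamp. A sketch, assuming the custom field is named timestamp:

```
filter {
  date {
    # With no "target" set, the parsed result replaces @timestamp,
    # so the event time becomes the log line's own time.
    match => [ "timestamp", "YYYY-MM-dd HH:mm:ss" ]
  }
}
```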

Show an example document. Copy/paste from Kibana's JSON tab or use a stdout { codec => rubydebug } output.
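For reference, a rubydebug output can sit alongside the existing elasticsearch output; it pretty-prints each event to the Logstash console so field names and values can be inspected:

```
output {
  # Print every event with all its fields for debugging
  stdout { codec => rubydebug }
}
```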

Here is the JSON (I cleaned it up a little):

    {
      "_index": "logstash-wowza-2017.11.24",
      "_type": "logs",
      "_id": "AWADM4lKIpEoNGxs-kPB",
      "_score": null,
      "_source": {
        "xstreamid": "1",
        "xseverity": "INFO",
        "tz": "CET",
        "source": "/usr/local/WowzaStreamingEngine-4.0.1/logs/wowzastreamingengine_access.log.2017-11-24",
        "csstreambytes": "5525048467",
        "timestamp": "2017-11-24 18:00:49",
        "tags": [ ... ],
        "@timestamp": "2017-11-24T23:00:49.000Z",
        "fields": {
          "application": "wowza"
        }
      },
      "fields": {
        "@timestamp": [ ... ]
      },
      "sort": [ ... ]
    }

If your timezone is UTC-0500 then things are as they should be since @timestamp is always UTC. This is not configurable.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.