ISO8601 to epoch seconds

Hi

I ingest data from a MySQL database. A specific column stores date/time values in ISO8601 format, which I would like to convert to @timestamp in Elasticsearch.

I followed the question here: Convert @timestamp to epoch
Unfortunately, it didn't work.

The filter section of my Logstash pipeline looks as follows:

filter {
    mutate {
        remove_field => [ "@timestamp" ]
    }
    ruby {
        code => "event.set('@timestamp', event.get('date_time').to_i)"
    }
}

But I always get following log entries:

[2020-02-12T16:20:04,562][ERROR][logstash.filters.ruby ][sagsys-vpos] Ruby exception occurred: wrong argument type Integer (expected LogStash::Timestamp)

How can I get rid of the exception?

Use a date filter instead of a ruby filter.
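For context, the ruby exception happens because Logstash only accepts a LogStash::Timestamp object for @timestamp, never a plain Integer. If a ruby filter is unavoidable, a minimal sketch that builds a proper Timestamp might look like this (assuming date_time has already been parsed by a date filter, so to_f yields epoch seconds):

ruby {
    # @timestamp must be a LogStash::Timestamp object, not an Integer.
    # LogStash::Timestamp.at() builds one from an epoch-seconds value.
    code => "event.set('@timestamp', LogStash::Timestamp.at(event.get('date_time').to_f))"
}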

I already tried:

date {
    match => [ "date_time" , "ISO8601" ]
    target => "@timestamp"
    remove_field => [ "date_time" ]
}

error message:

[2020-02-12T17:57:39,909][WARN ][logstash.outputs.elasticsearch][sagsys-vpos] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"sagsys-vpos", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x6968f32c>], :response=>{"index"=>{"_index"=>"connect-sagsys-vpos", "_type"=>"_doc", "_id"=>"kDdVOnABi2wWiksar-uX", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [@timestamp] of type [date] in document with id 'kDdVOnABi2wWiksar-uX'. Preview of field's value: '2020-02-12T16:57:37.874Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-02-12T16:57:37.874Z] with format [epoch_second]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

OK, so you must have an index template that says your @timestamp is formatted as epoch_second. That's not what Logstash is going to send, so Elasticsearch will not be able to parse what it receives.
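You can confirm what the index currently expects with the mapping API, for example (index name taken from the error above):

GET connect-sagsys-vpos/_mapping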

It is indeed specified that way: the @timestamp field is mapped as epoch_second in the index template.

"mappings" : {
      "properties" : {
        "@timestamp" : {
          "format" : "epoch_second",
          "type" : "date"
        }
      }
    }

Hi,

Can you please try the approach below:

date {
    match => [ "date_time", "ISO8601" ]
    target => "date_time"
}

ruby {
    code => 'event.set("@timestamp", event.get("date_time").to_i)'
}

Thanks for the hint Sukanya.
I tried it, without success:

Two error messages in logstash-plain:

[2020-02-13T11:44:28,206][WARN ][logstash.outputs.elasticsearch][sagsys-vpos] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"connect-sagsys-vpos", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x3a83e54b>], :response=>{"index"=>{"_index"=>"connect-sagsys-vpos", "_type"=>"_doc", "_id"=>"Z5ImPnABLMeXi7w5YBFU", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [@timestamp] of type [date] in document with id 'Z5ImPnABLMeXi7w5YBFU'. Preview of field's value: '2020-02-13T10:44:26.396Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-02-13T10:44:26.396Z] with format [epoch_second]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

[2020-02-13T11:44:28,228][ERROR][logstash.filters.ruby ][sagsys-vpos] Ruby exception occurred: wrong argument type Integer (expected LogStash::Timestamp)

Can you please give me the output of this:

ruby {
    code => 'event.set("epoch", event.get("date_time").to_i)'
}

Just to check whether conversion is happening fine or not.
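A quick way to inspect the result, if that helps, is a rubydebug stdout output (minimal sketch):

output {
    # Dump each event to the console so the new epoch field is visible
    stdout { codec => rubydebug }
}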

Thanks

Tried two things:

  1. Use an index name to which the above index template applies (@timestamp format = epoch_second)

This does not seem to work:

[2020-02-13T13:40:37,604][INFO ][logstash.outputs.elasticsearch][sagsys-vpos] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-02-13T13:40:37,605][DEBUG][org.logstash.config.ir.CompiledPipeline][sagsys-vpos] Compiled filter
 P[filter-ruby{"code"=>"event.set(\"epoch\", event.get(\"date_time\").to_i)"}|[str]pipeline:26:5:```
ruby {
        code => 'event.set("epoch", event.get("date_time").to_i)'
    }
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@311877f2
[2020-02-13T13:40:37,612][INFO ][logstash.javapipeline    ][sagsys-vpos] Pipeline started {"pipeline.id"=>"sagsys-vpos"}
[2020-02-13T13:40:37,616][DEBUG][logstash.outputs.elasticsearch][sagsys-vpos] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2020-02-13T13:40:37,615][DEBUG][org.logstash.execution.PeriodicFlush][sagsys-vpos] Pushing flush onto pipeline.
[2020-02-13T13:40:37,625][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"sagsys-vpos", :thread=>"#<Thread:0x7abc857 sleep>"}


[2020-02-13T13:40:39,470][WARN ][logstash.outputs.elasticsearch][sagsys-vpos] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"connect-sagsys-vpos", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7037a657>], :response=>{"index"=>{"_index"=>"connect-sagsys-vpos", "_type"=>"_doc", "_id"=>"36-QPnABLMeXi7w5v7SJ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [@timestamp] of type [date] in document with id '36-QPnABLMeXi7w5v7SJ'. Preview of field's value: '2020-02-13T12:40:38.156Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-02-13T12:40:38.156Z] with format [epoch_second]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
[2020-02-13T13:40:39,472][WARN ][logstash.outputs.elasticsearch][sagsys-vpos] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"connect-sagsys-vpos", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7f1ec3f8>], :response=>{"index"=>{"_index"=>"connect-sagsys-vpos", "_type"=>"_doc", "_id"=>"4K-QPnABLMeXi7w5v7SJ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [@timestamp] of type [date] in document with id '4K-QPnABLMeXi7w5v7SJ'. Preview of field's value: '2020-02-13T12:40:38.160Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-02-13T12:40:38.160Z] with format [epoch_second]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
  2. Use an index which has no index template specified

This works.


OK, so that's the problem. Remove it.
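For a legacy template that would be something along these lines (the template name here is a guess; list the real ones with GET _template first):

# Hypothetical template name -- check GET _template for the actual one
DELETE _template/connect-sagsys-vpos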

Unfortunately I can't.
Elasticsearch and Logstash should be able to handle this situation...
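If the template can at least be amended rather than removed, a date mapping can list several formats separated by ||, so a mapping along these lines should accept both the ISO8601 string Logstash sends and epoch seconds (a sketch, untested here):

"mappings" : {
    "properties" : {
        "@timestamp" : {
            "format" : "strict_date_optional_time||epoch_second",
            "type" : "date"
        }
    }
}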

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.