Tried two things:
- Use an index name that matches the index pattern described above (@timestamp mapped with format epoch_millis).
This does not seem to work (log excerpt below, followed by a mapping sketch):
[2020-02-13T13:40:37,604][INFO ][logstash.outputs.elasticsearch][sagsys-vpos] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001,
"settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string",
"mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>
{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"ty
pe"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-02-13T13:40:37,605][DEBUG][org.logstash.config.ir.CompiledPipeline][sagsys-vpos] Compiled filter
P[filter-ruby{"code"=>"event.set(\"epoch\", event.get(\"date_time\").to_i)"}|[str]pipeline:26:5:```
ruby {
code => 'event.set("epoch", event.get("date_time").to_i)'
}
```]
into
org.logstash.config.ir.compiler.ComputeStepSyntaxElement@311877f2
[2020-02-13T13:40:37,612][INFO ][logstash.javapipeline ][sagsys-vpos] Pipeline started {"pipeline.id"=>"sagsys-vpos"}
[2020-02-13T13:40:37,616][DEBUG][logstash.outputs.elasticsearch][sagsys-vpos] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2020-02-13T13:40:37,615][DEBUG][org.logstash.execution.PeriodicFlush][sagsys-vpos] Pushing flush onto pipeline.
[2020-02-13T13:40:37,625][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"sagsys-vpos", :thread=>"#<Thread:0x7abc857 sleep>"}
[2020-02-13T13:40:39,470][WARN ][logstash.outputs.elasticsearch][sagsys-vpos] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"connect-sagsys-vpos", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7037a657>], :response=>{"index"=>{"_index"=>"connect-sagsys-vpos", "_type"=>"_doc", "_id"=>"36-QPnABLMeXi7w5v7SJ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [@timestamp] of type [date] in document with id '36-QPnABLMeXi7w5v7SJ'. Preview of field's value: '2020-02-13T12:40:38.156Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-02-13T12:40:38.156Z] with format [epoch_second]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
[2020-02-13T13:40:39,472][WARN ][logstash.outputs.elasticsearch][sagsys-vpos] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"connect-sagsys-vpos", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7f1ec3f8>], :response=>{"index"=>{"_index"=>"connect-sagsys-vpos", "_type"=>"_doc", "_id"=>"4K-QPnABLMeXi7w5v7SJ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [@timestamp] of type [date] in document with id '4K-QPnABLMeXi7w5v7SJ'. Preview of field's value: '2020-02-13T12:40:38.160Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-02-13T12:40:38.160Z] with format [epoch_second]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
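The mapper_parsing_exception above indicates that whatever template applies to connect-sagsys-vpos maps @timestamp as a date with format epoch_second, so the ISO8601 value Logstash writes (e.g. 2020-02-13T12:40:38.156Z) cannot be parsed. A minimal sketch of a template whose @timestamp mapping would accept both the ISO8601 timestamp and epoch values, assuming a hypothetical template name and the connect-sagsys-vpos index seen in the log:
```
PUT _template/connect-sagsys-vpos
{
  "index_patterns": ["connect-sagsys-vpos*"],
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_second||epoch_millis"
      }
    }
  }
}
```
With multiple formats, Elasticsearch tries them left to right, so the ISO8601 @timestamp would parse while epoch values stay valid for that field.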
- Use an index for which no index template is specified.
This works.
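Presumably this works because, with no matching template (the installed logstash template only covers logstash-*), Elasticsearch falls back to dynamic mapping and date detection maps the ISO8601 string as a plain date field with the default format, which includes strict_date_optional_time. Assuming default settings and ignoring the other fields, the dynamically generated mapping should look roughly like this:
```
GET connect-sagsys-vpos/_mapping
```
```
{
  "connect-sagsys-vpos": {
    "mappings": {
      "properties": {
        "@timestamp": { "type": "date" }
      }
    }
  }
}
```
A date field without an explicit format defaults to strict_date_optional_time||epoch_millis, which parses 2020-02-13T12:40:38.156Z without error.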