Cannot change date format on @timestamp

Hello,

Filebeat does not recognize the date format I have in my logs (I also opened an issue on GitHub):
"2019-03-16T12:15:58.420454+0000"

So I tried to specify the format with a template.json file:

{
  "mappings": {
    "doc": {
      "properties": {
        "@timestamp": { "type": "date", "format": "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ" },
        "_@timestamp": { "type": "date", "format": "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ" },
        "layer": { "type": "keyword" }, 
        "ip_addr": { "type": "ip" }, 
        "string": { "type": "text" }, 
        "service": { "type": "keyword" }, 
        "parent_span_id": { "index": "false", "type": "long" }, 
        "trace_type": { "type": "keyword" }, 
        "trace_id": { "type": "long" }, 
        "label": { "type": "keyword" }, 
        "ip_port": { "type": "long" }, 
        "instance": { "type": "keyword" }, 
        "host": {
          "properties": {
            "host": { "ignore_above": 1024, "type": "keyword" }
          }
        },
        "num": { "type": "keyword" }, 
        "end_time": { "type": "double" }, 
        "key": { "type": "keyword" }, 
        "error": { "type": "boolean" }, 
        "cancelled": { "type": "boolean" }, 
        "path": { "type": "text" }, 
        "span_id": { "index": "false", "type": "long" }, 
        "start_time": { "type": "double" }, 
        "op": { "type": "keyword" },
        "duration_ms": { "type": "long" }
      }
    }
  }, 
  "template": "app-traces-*", 
  "settings": { "index.refresh_interval": "30s" }
}
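
For reference, the same template can also be pushed by hand, which rules out the Filebeat setup step; a quick sketch, assuming Elasticsearch answers on 10.200.3.221:9200 and the file above is the one on disk:

curl -s -X PUT "10.200.3.221:9200/_template/app-traces" -H 'Content-Type: application/json' -d @/usr/share/app-tracer-tools/traces_mapping_template.json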

This template has no effect, though; I still get the same error during log parsing:

2019-03-16T12:16:02.979Z	ERROR	jsontransform/jsonhelper.go:53	JSON: Won't overwrite @timestamp because of parsing error: parsing time "2019-03-16T12:15:58.420454+0000" as "2006-01-02T15:04:05Z07:00": cannot parse "+0000" as "Z07:00"
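
The layout quoted in the error is Go's reference time: Filebeat decodes @timestamp with the RFC 3339 layout "2006-01-02T15:04:05Z07:00", and that layout only accepts a zone offset written with a colon ("+00:00") or as "Z", never "+0000". A minimal stand-alone illustration of the Go-side parse (not Filebeat code, just a demonstration):

package main

import (
	"fmt"
	"time"
)

func main() {
	// The layout Filebeat's error message quotes: RFC 3339.
	layout := "2006-01-02T15:04:05Z07:00"

	// "+0000" has no colon, so this fails exactly like the Filebeat log above.
	_, err := time.Parse(layout, "2019-03-16T12:15:58.420454+0000")
	fmt.Println(err)

	// With a colon in the offset the same value parses fine; Go accepts the
	// fractional seconds even though the layout does not spell them out.
	t, _ := time.Parse(layout, "2019-03-16T12:15:58.420454+00:00")
	fmt.Println(t)
}

This also means the date format in the mapping template cannot help here: the parse fails inside Filebeat, before Elasticsearch ever sees the document.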

I also tried to rename the field in the filebeat.yml file (rename it at the input level, then rename it back in the global processors):

filebeat.prospectors:
- fields.document_type: doc
  fields_under_root: true
  json.keys_under_root: true
  json.overwrite_keys: true
  json.add_error_key: false
  input_type: log
  paths:
  - /var/log/app-traces/trace-*.log
  processors:
    - rename:
        fields:
          - from: "@timestamp"
            to: "_@timestamp"

output.elasticsearch:
  hosts:
  - 10.200.3.221
  - 10.200.3.220
  - 10.200.3.187
  - 10.200.1.76
  - 10.200.1.251
  - 10.200.1.89
  index: app-traces-%{+yyyy.MM.dd}

setup.template.enabled: true
setup.template.json.enabled: true
setup.template.json.name: app-traces
setup.template.json.path: /usr/share/app-tracer-tools/traces_mapping_template.json
setup.template.name: app-traces
setup.template.pattern: app-traces*
setup.template.fields: /etc/filebeat/fields.yml

processors:
- rename:
    fields:
      - from: "_@timestamp"
        to: "@timestamp"

Here is an example of the log file I have to parse:

{"host":"s3-ssl-conn-0.localdomain","service":"sfused","instance":"unconfigured","pid":31737,"trace_type":"op","trace_id":1452107967111228,"span_id":8505715073326365,"parent_span_id":210198511458314,"@timestamp":"2019-03-16T12:20:46.699229+0000","start_time":1552738846699.229,"end_time":1552738846705.233,"duration_ms":6.003906,"op":"service","layer":"workers_arc_sub","error":false,"cancelled":false,"tid":32505}

Problem persists.

The only way to make it work is to disable the "json.keys_under_root" setting, but in that case every field is prefixed with "json." (like json.@timestamp).
As I do not control the tools that consume the data indexed in Elasticsearch, I cannot change the names of the ingested fields.

Is there something I can do?

To add more information:
it is also impossible to rename the '@timestamp' field (to "_trtimestamp" for example) coming from the log file.

(I tried with and without json.overwrite_keys.) It looks like the rename processor cannot fetch @timestamp at all; the event's own timestamp is apparently not exposed as a regular field, so rename fails with "key not found":

2019-03-18T16:46:35.149Z	ERROR	jsontransform/jsonhelper.go:53	JSON: Won't overwrite @timestamp because of parsing error: parsing time "2019-03-18T16:46:28.886938+0000" as "2006-01-02T15:04:05Z07:00": cannot parse "+0000" as "Z07:00"
2019-03-18T16:46:35.149Z	DEBUG	[rename]	actions/rename.go:78	Failed to rename fields, revert to old event: could not fetch value for key: @timestamp, Error: key not found
2019-03-18T16:46:35.149Z	DEBUG	[filter]	pipeline/processor.go:174	fail to apply processor client{rename=[{From:@timestamp To:_trtimestamp}]}: could not fetch value for key: @timestamp, Error: key not found
2019-03-18T16:46:35.149Z	DEBUG	[publish]	pipeline/processor.go:308	Publish event: {
  "@timestamp": "2019-03-18T16:46:35.149Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.6.2"
  },
  "trace_type": "op",
  "error": {
    "message": "@timestamp not overwritten (parse error on 2019-03-18T16:46:28.886938+0000)",
    "type": "json"
  },
  "source": "/var/log/app-traces/trace-sfused-2019-03-18_16h00-19183.log",
  "host": {
    "name": "s3-ssl-conn-0.localdomain"
  },
  "pid": 19183,
  "log": {
    "file": {
      "path": "/var/log/app-traces/trace-sfused-2019-03-18_16h00-19183.log"
    }
  },
  "layer": "workers_chord",
  "service": "sfused",
  "duration_ms": 0.326904,
  "parent_span_id": 6521332636548366,
  "instance": "unconfigured",
  "tid": 19284,
  "trace_id": 2184600356207134,
  "cancelled": false,
  "end_time": 1.552927588887265e+12,
  "span_id": 7453220767237796,
  "input": {
    "type": "log"
  },
  "offset": 0,
  "start_time": 1.552927588886938e+12,
  "op": "wait",
  "prospector": {
    "type": "log"
  },
  "traces": true,
  "beat": {
    "name": "s3-ssl-conn-0.localdomain",
    "hostname": "s3-ssl-conn-0.localdomain",
    "version": "6.6.2"
  }
}

As the published event shows, @timestamp falls back to Filebeat's own read time and the original value from the log line is lost.
I have no solution to handle my log files right now.
Do you have any workaround, or a working configuration?

Hello,

another failure for me.
I tried to use a pre-processing (ingest) pipeline.

If I rename "@timestamp" to something else, the "@timestamp" field disappears entirely.
(I expected Filebeat to create its own @timestamp so I could work on the renamed field.)

If I reformat the date instead, the @timestamp value from the log file is replaced by Filebeat's read time (I guess the message is processed in several stages before the pipeline sees it).

I'm a bit discouraged, but just in case (and to help other users), here are the two pipeline configurations I used:

filebeat.yml:

filebeat.inputs:
- type: log
  paths:
  - /var/log/app-traces/trace-*.log
  json.keys_under_root: true
  json.overwrite_keys: false
  json.add_error_key: false
  fields:
    traces: true
  fields_under_root: true
  pipeline: app-pipeline-tracer

output.elasticsearch:
  hosts:
  - 10.200.3.221
  index: app-traces-%{+yyyy.MM.dd}

setup.template.enabled: true
setup.template.json.enabled: true
setup.template.json.name: app-traces
setup.template.json.path: /usr/share/app-tracer-tools/traces_mapping_template.json
setup.template.name: app-traces
setup.template.pattern: app-traces*

First pipeline (reformat the date):

curl -X DELETE "10.200.3.221:9200/_ingest/pipeline/app-pipeline-tracer"
curl -s -X PUT "10.200.3.221:9200/_ingest/pipeline/app-pipeline-tracer" -H 'Content-Type: application/json' -d'
{
  "description": "change date format for tracer",
  "version": 1,
  "processors": [
    {
      "date" : {
        "field" : "@timestamp",
        "target_field" : "@timestamp",
        "formats": ["yyyy-MM-dd'\''T'\''HH:mm:ss.SSSSSSZ", "yyyy-MM-dd'\''T'\''HH:mm:ss.SSSZZ"]
      }
    }
  ]
}
'
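
The format string itself can be checked in isolation with the simulate API before wiring the pipeline in; a quick sketch (the field name "ts" is just a stand-in):

curl -s -X POST "10.200.3.221:9200/_ingest/pipeline/_simulate" -H 'Content-Type: application/json' -d'
{
  "pipeline": {
    "processors": [
      { "date": { "field": "ts", "formats": ["yyyy-MM-dd'\''T'\''HH:mm:ss.SSSSSSZ"] } }
    ]
  },
  "docs": [ { "_source": { "ts": "2019-03-16T12:15:58.420454+0000" } } ]
}
'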

Second pipeline (rename the field):

curl -X DELETE "10.200.3.221:9200/_ingest/pipeline/app-pipeline-tracer"
curl -s -X PUT "10.200.3.221:9200/_ingest/pipeline/app-pipeline-tracer" -H 'Content-Type: application/json' -d'
{
  "description": "change date format for tracer",
  "version": 1,
  "processors": [
    {
        "rename" : {
          "field" : "@timestamp",
          "target_field" : "@tr_timestamp"
        }
    }
  ]
}
'
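
One variant that might still be worth trying (sketched here, untested): leave json.keys_under_root disabled so Filebeat never touches @timestamp itself, and let the pipeline parse the prefixed copy instead; Joda's "Z" accepts the colon-less "+0000", so the date processor should cope with this format:

curl -s -X PUT "10.200.3.221:9200/_ingest/pipeline/app-pipeline-tracer" -H 'Content-Type: application/json' -d'
{
  "description": "parse the original timestamp from the json.* copy",
  "version": 1,
  "processors": [
    {
      "date" : {
        "field" : "json.@timestamp",
        "target_field" : "@timestamp",
        "formats": ["yyyy-MM-dd'\''T'\''HH:mm:ss.SSSSSSZ"]
      }
    },
    {
      "remove" : {
        "field" : "json.@timestamp"
      }
    }
  ]
}
'

The other fields would still carry the "json." prefix, so this only solves the timestamp half; stripping the prefix would take one rename processor per field (or a script processor).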
