Solved - Duplicate field data problem and timestamp question

The timestamp issue turned out to be an invalid date format: Logstash couldn't parse the date with the pattern I was applying.
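For anyone who lands here, a minimal sketch of the kind of date filter that resolves this, assuming the timestamp field captured by the grok pattern further down and the yyyy-MM-dd HH:mm:ss.SSS format in my logs (the timezone value is an assumption; adjust it to wherever your servers log local time):

filter {
  date {
    # Parse the bracketed timestamp captured by grok into @timestamp.
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
    # Assumption: the log timestamps are local time, not UTC.
    timezone => "America/New_York"
    # Optional: drop the now-redundant string field.
    remove_field => [ "timestamp" ]
  }
}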

Timestamp Question:
Given the configs and log messages below, how would I go about making the timestamp field from my log message replace the @timestamp field that ends up in Elasticsearch? I've tried everything I could find and couldn't figure it out... plus somewhere along the way I caused the problem I'm about to describe.

Duplicate Field Data Problem: Edit - I found the duplicate data issue. Logstash was pulling the same filter a second time from a test file I had left in the config directory. I was unaware that it would process files without a .conf extension as configuration files.
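For anyone else who hits this: when Logstash's path.config points at a bare directory, it concatenates every file in that directory regardless of extension. Restricting it to a *.conf glob keeps stray test files out of the pipeline; a sketch, assuming the stock /etc/logstash layout:

# pipelines.yml (paths are assumptions; adjust to your install)
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"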

The message in the log file looks like this:

[2018-06-19 20:16:35.398] [webservicedev.ourplace.com] [http-nio-8586-exec-1] DEBUG c.w.controller.ItemBalanceController@getItemBalanceWithAdHocSql:53 - Getting balance using SQL for item number: ATC17002504

I have 2 logstash .conf files.
02-beats-input.conf

input {
  beats {
    port => 5044
  }
}

filter {
        grok {
                match => { "message" =>  "(?m)\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{HOSTNAME:host}\] \[%{DATA:thread}\] %{LOGLEVEL:logLevel} %{DATA:class}@%{DATA:method}:%{DATA:line} \- %{GREEDYDATA:message}"}
  }
}

and 30-output.conf

output {
  elasticsearch {
    hosts => ["localhost"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

And my filebeat.yml is:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /opt/runnables/*/logs/application.log
  multiline.pattern: ^\[
  multiline.negate: true
  multiline.match: after

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml

  reload.enabled: true

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.logstash:
    hosts: ["elkhost:5045"]

The message JSON ends up like this:

{
  "_index": "filebeat-2018.06.20",
  "_type": "doc",
  "_id": "JliMGmQBM11e1Fl9U3KI",
  "_version": 1,
  "_score": null,
  "_source": {
    "offset": 66226,
    "method": [
      "getItemBalanceWithAdHocSql",
      "getItemBalanceWithAdHocSql"
    ],
    "line": [
      "53",
      "53"
    ],
    "prospector": {
      "type": "log"
    },
    "source": "/opt/runnables/item-balance-service/logs/application.log",
    "thread": [
      "http-nio-8586-exec-1",
      "http-nio-8586-exec-1"
    ],
    "message": [
      "[2018-06-19 20:16:35.398] [webservicedev.ourplace.com] [http-nio-8586-exec-1] DEBUG c.w.controller.ItemBalanceController@getItemBalanceWithAdHocSql:53 - Getting balance using SQL for item number: ATC17002504",
      "Getting balance using SQL for item number: ATC17002504",
      "Getting balance using SQL for item number: ATC17002504"
    ],
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "@timestamp": "2018-06-20T00:16:40.570Z",
    "logLevel": [
      "DEBUG",
      "DEBUG"
    ],
    "@version": "1",
    "beat": {
      "name": "webservicedev",
      "hostname": "webservicedev",
      "version": "6.2.2"
    },
    "host": [
      "webservicedev",
      "webservicedev.ourplace.com",
      "webservicedev.ourplace.com"
    ],
    "class": [
      "c.w.controller.ItemBalanceController",
      "c.w.controller.ItemBalanceController"
    ],
    "timestamp": [
      "2018-06-19 20:16:35.398",
      "2018-06-19 20:16:35.398"
    ]
  },
  "fields": {
    "@timestamp": [
      "2018-06-20T00:16:40.570Z"
    ]
  },
  "highlight": {
    "source": [
      "/opt/runnables/@kibana-highlighted-field@item@/kibana-highlighted-field@-@kibana-highlighted-field@balance@/kibana-highlighted-field@-service/logs/@kibana-highlighted-field@application.log@/kibana-highlighted-field@"
    ],
    "message": [
      "[2018-06-19 20:16:35.398] [webservicedev.ourplace.com] [http-nio-8586-exec-1] @kibana-highlighted-field@DEBUG@/kibana-highlighted-field@ c.w.controller.ItemBalanceController@getItemBalanceWithAdHocSql:53 - Getting balance using SQL for item number: ATC17002504"
    ]
  },
  "sort": [
    1529453800570
  ]
}

Thanks for any advice.
Mike

Hi,

This is a Logstash issue, as none of those fields (timestamp, host, thread, method, etc.) exist until the event arrives at Logstash. They are created by the grok filter in your 02-beats-input.conf.

I don't know enough about Logstash to help you with this; I suggest you open this issue in the #logstash forum.

OK. Thanks.
