Can I replace the @timestamp of the dashboard with the time at which the event gets logged in the file?

Hi there,

I want to replace the @timestamp value with the log time. I am using Filebeat as the log collector.

Here is the filter I am trying:

grok {
          match => { "message" => "\[%{DATA:datetime}\] INFO \[.*\] %{GREEDYDATA:logger_json}" }
      }

      date {
          match => ["DATA:timestamp", "dd-MMM-yyyy HH:mm:ss"]
          target => "@timestamp"
      }

      json {
          source => "logger_json"
          target => "logger_json_parsed"
      }

And here is a sample of the log messages:

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99707

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99708

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99709

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99710

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99711

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99712

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99713

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99714

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99715

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99716

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99717

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99718

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99719

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99720

[13-Mar-2024 07:42:38] INFO [:10] [GenAIVoiceBot] Server 1 app log number 99721

Please help

You should use the datetime field, not timestamp.

      date {
          match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
          target => "@timestamp"
      }
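
Put together, the relevant part of the filter would look like this (a sketch assembled from the snippets in this thread, assuming the log format shown above):

      grok {
          match => { "message" => "\[%{DATA:datetime}\] INFO \[.*\] %{GREEDYDATA:logger_json}" }
      }

      date {
          match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
          target => "@timestamp"
      }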

I do not know why, but after replacing DATA:timestamp with datetime, the logs are not getting pushed to OpenSearch.

It started pushing logs with the config below:

filter {
  if [fields][log_type] == "monitor" {
    mutate {
      add_field => { "log_location" => "call_monitoring_log" }
    }

    grok {
      match => { "message" => "\[%{DATA:datetime}\] INFO \[.*\] %{GREEDYDATA:logger_json}" }
    }

    date {
      match => ["DATA:datetime", "dd-MMM-yyyy HH:mm:ss"]
      target => "@timestamp"
    }

    json {
      source => "logger_json"
      target => "logger_json_parsed"
    }
  }
}

But the timestamp and the datetime are still not the same.

OpenSearch/OpenDistro are AWS-run products and differ from the original Elasticsearch and Kibana products that Elastic builds and maintains. You may need to contact them directly for further assistance.

(This is an automated response from your friendly Elastic bot. Please report this post if you have any suggestions or concerns :elasticheart: )

Can you show the JSON record from Kibana before the change? What is in "tags"?
Have you tested your new .conf after the date changes?

The logger_json field doesn't contain any JSON structure, it's plain text.

{
               "message" => "[18-Mar-2024 23:30:41] INFO [<string>:49] [GenAIVoiceBot] 998 {\"@timestamp\": \"Mar 18, 2024 @ 23:30:41.954588\"}",
              "@version" => "1",
                   "ecs" => {
        "version" => "8.0.0"
    },
                "fields" => {
        "log_type" => "monitor"
    },
          "log_location" => "call_monitoring_log",
    "logger_json_parsed" => {
        "@timestamp" => "Mar 18, 2024 @ 23:30:41.954588"
    },
            "@timestamp" => 2024-03-18T18:00:42.233Z,
             "container" => {
        "id" => "call_monitoring_logger.log"
    },
           "logger_json" => "998 {\"@timestamp\": \"Mar 18, 2024 @ 23:30:41.954588\"}",
                  "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
                   "log" => {
        "offset" => 552379,
          "file" => {
                 "path" => "...",
                "inode" => "6195882"
        }
    },
                 "cloud" => {
                 "provider" => "openstack",
                 "input" => {
        "type" => "filestream"
    },
              "datetime" => "18-Mar-2024 23:30:41"
}

Here is the stdout log. I am using Logstash for my use case.

And which @timestamp field should be used: the one at the beginning, [18-Mar-2024 23:30:41], or the one at the end, "@timestamp": "Mar 18, 2024 @ 23:30:41.954588"?

The first one should be used

You have wrongly added DATA in match, here: match => ["DATA:datetime"

I have changed dates to be more visible.

input {
  generator {
    message => '[11-Mar-2024 21:10:11] INFO [<string>:49] [GenAIVoiceBot] 998 {"@timestamp": "Mar 18, 2024 @ 23:30:41.954588"}'
    count => 1
  }
}

filter {
  #if [fields][log_type] == "monitor" { # temporarily commented
    mutate { add_field => { "log_location" => "call_monitoring_log" } }

    grok { match => { "message" => "\[%{DATA:datetime}\] INFO \[.*\] %{GREEDYDATA:logger_json}" } }

   date {
       match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
       target => "@timestamp"
   }

   json {
       source => "logger_json"
       target => "logger_json_parsed"
   }
#}

}

output {
    stdout {codec => rubydebug{} }
}

Result:

{
            "@timestamp" => 2024-03-11T20:10:11.000Z,
               "message" => "[11-Mar-2024 21:10:11] INFO [<string>:49] [GenAIVoiceBot] 998 {\"@timestamp\": \"Mar 18, 2024 @ 23:30:41.954588\"}",
              "datetime" => "11-Mar-2024 21:10:11",
           "logger_json" => "998 {\"@timestamp\": \"Mar 18, 2024 @ 23:30:41.954588\"}",
    "logger_json_parsed" => {
        "@timestamp" => "Mar 18, 2024 @ 23:30:41.954588"
    },
          "log_location" => "call_monitoring_log"
}
  • "@timestamp" will get the value from the first field and will be as the date type.
  • datetime will be as string field. If you don't need you can replace with [@metadata][datetime] or simply remove_field.
  • [logger_json_parsed][@timestamp] is another field as string.
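
For example, since the date filter's common options (such as remove_field) are only applied when the parse succeeds, the temporary string field can be dropped in the same filter (a sketch, untested against this pipeline):

   date {
       match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
       target => "@timestamp"
       remove_field => ["datetime"]
   }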

Okay, so what is the final config? I am a bit confused now.

I tried the same as you said, but now the logs are not getting pushed to OpenSearch.

Here is the error which I am getting:

Error parsing json {:source=>"logger_json", :raw=>"{'@timestamp': '2024-03-19T15:22:43.911625'}", :exception=>#<LogStash::Json::ParserError: Unexpected character (''' (code 39)): was expecting double-quote to start field name

If I want the last timestamp to be used instead of the first, what would the conf be?

You should add this after the json filter:

   date {
       match => ["[logger_json_parsed][@timestamp]", "MMM dd, yyyy @ HH:mm:ss.SSSSSS"]
       target => "@timestamp" # or any other name
   }

No it is not working

Also, regarding the above answer you gave for datetime: I changed the datetime to UTC in the logger, and because of this the logs are getting pushed to OpenSearch. So is there any way to convert @timestamp to UTC after mapping, so that I do not need to refactor my code much?

@Rios please help, this is a bit urgent. Help me fix the below command as per the need:

sudo docker run --restart unless-stopped --log-driver local --log-opt max-size=10m --memory="1g" --cpus="1.0" --oom-kill-disable -dit -p port:port -v mount:/var/log/logstash --name test --user=root opensearchproject/logstash-oss-with-opensearch-output-plugin:7.16.2 -e '
input {
  beats {
    port => 5045
  }
}
filter {
if [fields][log_type] == "monitor" {
    mutate {
        add_field => { "log_location" => "call_monitoring_log" }
    }

    grok {
        match => { "message" => "\[%{DATA:datetime}\] INFO \[.*\] %{GREEDYDATA:logger_json}" }
    }

    date {
        match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
        target => "@timestamp"
    }

    json {
        source => "logger_json"
        target => "logger_json_parsed"
    }
  }
}
output {
if [log_location] == "call_monitoring_log" {
    opensearch {
    }

    stdout {
      codec => rubydebug
    }
  }
}'

[19-Mar-2024 22:19:18] INFO [:31] [GenAIVoiceBot] {'@timestamp': 'Mar-19-2024 @ 22:19:18.689375'}

[19-Mar-2024 22:25:12] INFO [:31] [GenAIVoiceBot] {'@timestamp': 'Mar-19-2024 @ 22:25:12.960077'}

[19-Mar-2024 22:28:57] INFO [:31] [GenAIVoiceBot] {'@timestamp': 'Mar-19-2024 @ 22:28:57.907195'}

[19-Mar-2024 22:29:40] INFO [:31] [GenAIVoiceBot] {'@timestamp': 'Mar-19-2024 @ 22:29:40.122095'}

That is the log format.

After replacing @timestamp with the datetime, it should be converted to UTC time and pushed to OpenSearch.
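
One way to get that conversion, assuming the datetime in the log is local time, is the date filter's timezone option: it declares the timezone of the source string, and the parsed @timestamp is always stored as UTC. A sketch ("Asia/Kolkata" is a guess based on the +05:30 offset visible in the earlier rubydebug output; use your logger's actual timezone):

   date {
       match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
       target => "@timestamp"
       timezone => "Asia/Kolkata"
   }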

Logstash cannot parse JSON that uses single quotes. Try:

mutate { gsub => [ "logger_json", "'", '"' ] }

For the last case:

input {
  generator {
    #message => '[11-Mar-2024 21:10:11] INFO [<string>:49] [GenAIVoiceBot] 998 {"@timestamp": "Mar 12, 2024 @ 23:30:31.954588"}'
    message => "[19-Mar-2024 22:25:12] INFO [:31] [GenAIVoiceBot] {'@timestamp': 'Mar-19-2024 @ 22:25:12.960077'}"
    count => 1
  }
}

filter {
  #if [fields][log_type] == "monitor" {
    mutate { add_field => { "log_location" => "call_monitoring_log" } }

    grok { match => { "message" => "\[%{DATA:datetime}\] INFO \[.*\] %{GREEDYDATA:logger_json}" } }

   # date {
       # match => ["datetime", "dd-MMM-yyyy HH:mm:ss"]
       # target => "@timestamp"
   # }

   mutate { gsub => [ "logger_json", "'", '"' ] }

   json {
       source => "logger_json"
       target => "logger_json_parsed"
   }


   date {
       match => ["[logger_json_parsed][@timestamp]", "MMM-dd-yyyy @ HH:mm:ss.SSSSSS", "MMM dd, yyyy @ HH:mm:ss.SSSSSS"]
       target => "@timestamp"
   }

#}

}

output {
    stdout {codec => rubydebug{} }
}

Result:

{
              "datetime" => "19-Mar-2024 22:25:12",
    "logger_json_parsed" => {
        "@timestamp" => "Mar-19-2024 @ 22:25:12.960077"
    },
               "message" => "[19-Mar-2024 22:25:12] INFO [:31] [GenAIVoiceBot] {'@timestamp': 'Mar-19-2024 @ 22:25:12.960077'}",
          "log_location" => "call_monitoring_log",
           "logger_json" => "{\"@timestamp\": \"Mar-19-2024 @ 22:25:12.960077\"}",
            "@timestamp" => 2024-03-19T21:25:12.960Z
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.