Timestamp from log files to @timestamp


!NOTE! I'm new to the ELK stack in all its facets.

I have some problems displaying the log files from a Laravel application.

I'm running Laravel on one server and my ELK stack on another. I installed Logstash on the Laravel server as well and wrote a Logstash conf for the Laravel logs.

It looks like this:

input {
        file {
                path => "/var/www/html/storage/logs/laravel.log"
                start_position => "beginning"
                ignore_older => 0
                codec => multiline {
                        pattern => "\[[\d]{4}"
                        negate => "true"
                        what => "previous"
                }
        }
}

filter {
        if [type] == "laravel" {
                grok {
                        match => {
                                "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: %{DATA:message} \["
                        }
                }
                date {
                        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
                        target => "@timestamp"
                        locale => "en"
                        timezone => "UTC"
                }
        }
}

output {
        elasticsearch {
                hosts => ["XXX.XXX.XX.XX:9200"]
                index => "laravel-%{+YYYY.MM.dd}"
        }
}

The index is created and the messages are shown, etc. The only problem is the date filter: I'm not able to configure it so that it recognizes the original timestamp from the log itself and writes it to @timestamp. Also, is there any possibility to add a filter so that the severity ends up in a separate field in the ES index?

Thanks in advance

Cheers :slight_smile:


What do your messages look like? You didn't share any sample of your messages; without one it is not possible to replicate your pipeline and see if it has any errors.

Looking at the filter you shared you have everything inside a conditional:

if [type] == "laravel"

But your input does not set type => laravel, so this conditional will never match and the plugins inside it will never run.

Remove this conditional as it does not make much sense to have it.
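Alternatively, if you prefer to keep the conditional, you can set the type on the events in the input itself. A minimal sketch (same file input as yours, with only the type line added):

```
input {
        file {
                path => "/var/www/html/storage/logs/laravel.log"
                type => "laravel"
        }
}
```

With the type set on the input, if [type] == "laravel" in the filter block will match as intended.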

Hey, I removed the "type" condition. I found it on Stack Overflow and thought maybe it would be good usage :sweat_smile:

An example message would look like this:

[2023-10-11 10:05:16] local.ERROR: No alive nodes found in your cluster {"userId":1772,"email":"test@test.de","exception":"[object] (Elasticsearch\\Common\\Exceptions\\NoNodesAvailableException(code: 0): No alive nodes found in your cluster at /var/www/html/vendor/elasticsearch/elasticsearch/src/Elasticsearch/ConnectionPool/StaticNoPingConnectionPool.php:53)

The content inside can be ignored; it's an old one.

Running your pipeline with this sample message shows that the grok filter is not working, so it will not parse your message and will not create the timestamp field.

You need to change your grok filter to something like this:

    grok {
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: %{DATA:msg}\n\[" }
        remove_field => ["message"]
    }

I added the \n that you have before the [ of the stack trace, and changed the destination field from message to msg; grok will not replace the original message field, so you need to save your parsed data in a different field.
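As an alternative sketch, if you would rather keep the field named message, grok has an overwrite option that tells it to replace the original field with the captured value:

```
    grok {
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: %{DATA:message}\n\[" }
        overwrite => ["message"]
    }
```

This way the event keeps a single message field containing only the log text, without the leading timestamp and severity.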

There is no error in your date filter, but there is one thing you need to pay attention to.

timezone => "UTC"

The timezone option is used to specify the timezone of the source date-time string when it is not in UTC. If your source date-time field is already in UTC, you do not need it; it will basically change nothing.

Is 2023-10-11 10:05:16 already in UTC? If it is, you can remove the timezone option from the date filter; if it is not, you need to specify the timezone of that date.
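For example, a sketch assuming the Laravel server writes its logs in Europe/Berlin local time (a placeholder — use your server's actual zone):

```
    date {
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
        target => "@timestamp"
        timezone => "Europe/Berlin"
    }
```

With this, Logstash interprets the parsed string as Europe/Berlin local time and converts it to UTC before storing it in @timestamp.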

This is the result I got:

[2023-11-01T10:33:53,456][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
{
           "env" => "local",
    "@timestamp" => 2023-10-11T10:05:16.000Z,
           "msg" => "No alive nodes found in your cluster {\"userId\":1772,\"email\":\"test@test.de\",\"exception\":\"[object] (Elasticsearch\\\\Common\\\\Exceptions\\\\NoNodesAvailableException(code: 0): No alive nodes found in your cluster at /var/www/html/vendor/elasticsearch/elasticsearch/src/Elasticsearch/ConnectionPool/StaticNoPingConnectionPool.php:53)",
     "timestamp" => "2023-10-11 10:05:16",
      "@version" => "1",
          "host" => "logstash-lab",
      "severity" => "ERROR",
          "tags" => [
        [0] "multiline"
    ]
}
[2023-11-01T10:33:53,652][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}


That's the solution, perfect: the \n. Now it seems to recognize everything.

I was never a fan of regex, and now I think grok filters will be my new enemy.

Thank you very much.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.