Logstash - problems with date

Log example:
[2018-03-28 12:00:21] Processed: shop\Export\Jobs\ExportOrders
[2018-03-28 12:00:22] Processed: shop\Export\Jobs\ExportOrders

Logstash configuration:
input {
  file {
    path => "path/to/test/file.log"
    start_position => "beginning"
  }
}

filter {
    grok {
      match => { "message" => "\[%{GREEDYDATA:logtime}\] %{WORD:status}: %{GREEDYDATA:process}" }
    }
    date {
      timezone => "Europe/Prague"
      match => ["sourcestamp", "yyyy-MM-dd HH:mm:ss"]
      target => "@timestamp"
    }
}

output {
  elasticsearch {
    hosts => ["xxx.xxx.xxx.xxx:xxxx"]
  }
  stdout { codec => rubydebug }
}

Elasticsearch output (partial, the part that matters):
"_source": {
"process": "shop\Export\obs\ExportOrders",
"@timestamp": "2018-04-23T11:12:05.458Z",
"message": "[2018-03-28 12:00:18] Processed: shop\Export\Jobs\ExportOrders",
"path": "/data/queue/timestamp_test1.log",
"status": "Processed",
"host": "kibana0",
"logtime": "2018-03-28 12:00:18",
"@version": "1"
}

The problem I have is that I am unable to replace @timestamp with logtime. I know there are many topics about this, but no solution there helped me, so I decided to open a new topic with my exact configuration and log. I do not know what I should do. Can you help me?
Thanks, David

The field name should be logtime, not sourcestamp, as that is what your grok filter extracts.
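
With that one change, the date block would look like this (everything else stays the same):

date {
  timezone => "Europe/Prague"
  match => ["logtime", "yyyy-MM-dd HH:mm:ss"]
  target => "@timestamp"
}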

The problem is, when I use the same field name, I get no error in the Logstash log, but the Elasticsearch output is then empty:

error": {
        "root_cause": [
            {
                "type": "index_not_found_exception",
                "reason": "no such index",
                "resource.type": "index_or_alias",
                "resource.id": "logstash-2018.04.23",
                "index_uuid": "_na_",
                "index": "logstash-2018.04.23"
            }
        ],
        "type": "index_not_found_exception",
        "reason": "no such index",
        "resource.type": "index_or_alias",
        "resource.id": "logstash-2018.04.23",
        "index_uuid": "_na_",
        "index": "logstash-2018.04.23"

Have you by any chance disabled automatic index creation in Elasticsearch?
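
You can check that on the Elasticsearch side, for example (localhost:9200 is a placeholder for your host and port):

curl -XGET 'http://localhost:9200/_cluster/settings?include_defaults=true&pretty'

Look for action.auto_create_index in the response; with the default value of true, Logstash may create its daily logstash-* indices itself.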

No, as far as I know. The only changes I made to the Elasticsearch config are the host and port settings.
Actually, when I ran with the configuration you suggested (the field logtime), it created a folder in "indices" (Elasticsearch) like it always does, so I assume it received the data from Logstash, but perhaps corrupted. But then I wonder why I get no parse error from Logstash.

What does the rubydebug output look like?

So far I have been unable to capture the rubydebug output directly. I am trying to do this with:

output {
    file {
      path => "/tmp/rubydebug_output.txt"
      codec => rubydebug
    }
}

But the file remains empty. I am not sure if it's supposed to be like that.
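
One likely cause, assuming you re-ran Logstash over a file it had already read: the file input keeps track of how far it got in a sincedb file, so an already-read file produces no new events even with start_position => "beginning". For repeated tests you can disable that bookkeeping:

input {
  file {
    path => "/data/queue/timestamp_test2.log"
    start_position => "beginning"
    # testing only: never remember the read position,
    # so the whole file is replayed on every run
    sincedb_path => "/dev/null"
  }
}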

OK, I have some new information. When I use the config Christian suggested (the one that causes the Elasticsearch 404 error), the rubydebug output is empty.
When I try a different config, I get some results:

input {
  file {
    path => "/data/queue/timestamp_test2.log"
    start_position => "beginning"
  }
}

filter {
    grok {
      match => { "message" => "%{WORD:status}: %{GREEDYDATA:process}" }
    }
    date {
      timezone => "Europe/Prague"
      match => ["logtime", "yyyy-MM-dd HH:mm:ss"]
      target => "@timestamp"
    }
}

output {
    file {
      path => "/tmp/rubydebug_output.txt"
      codec => rubydebug
    }
}

Then the output of rubydebug looks like:

"host" => "xxx",
    "@timestamp" => 2018-04-25T09:00:00.529Z,
       "process" => "shop\\Export\\Jobs\\ExportOrders",
      "@version" => "1",
          "path" => "/data/queue/timestamp_test2.log",
       "message" => "[2018-03-28 12:00:22] Processed: shop\\Export\\Jobs\\ExportOrders",
        "status" => "Processed"

You are basing the date filter on the logtime field, which you are not extracting in your grok filter (which does not match what your message field looks like anyway).

That is why I am confused that it doesn't work the way you said.
Of course it can't work in the last example, because there is no logtime field. I just wanted to show that when I change the config to something else (or use different field names), I get working output.

When I posted the issue, I had different field names. GREEDYDATA:logtime was there, as a string looking like this: "logtime": "2018-03-28 12:00:18".
So I thought the date filter must process it when using the same field name; I had no parse error or anything like that, the output was simply empty.

Later I also tried:

filter {
    grok {
      match => { "message" => "\[%{GREEDYDATA:logtime}\] %{WORD:status}: %{GREEDYDATA:process}" }
    }
    date {
      timezone => "Europe/Prague"
      match => ["%{logtime}", "yyyy-MM-dd HH:mm:ss"]
      target => "@timestamp"
    }
}

But the output is still empty. I don't know what I should try next.
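
A note on that last variant: the date filter's match option expects a plain field name, not a sprintf reference. "%{logtime}" is taken literally, so the filter looks for a field actually named %{logtime}; no such field exists, and the date filter then silently skips the event, which is also why there is no parse failure.

# looks for a field literally called "%{logtime}" - matches nothing, no error
match => ["%{logtime}", "yyyy-MM-dd HH:mm:ss"]
# refers to the extracted field - this form works
match => ["logtime", "yyyy-MM-dd HH:mm:ss"]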

I did a small test:

datefailtest.conf:

input { stdin{} }
filter {
    grok {
      match => { "message" => "\[%{GREEDYDATA:logtime}\] %{WORD:status}: %{GREEDYDATA:process}" }
    }
    date {
      timezone => "Europe/Prague"
      match => ["logtime", "yyyy-MM-dd HH:mm:ss"]
      target => "@timestamp"
    }
}
output { stdout{ codec => rubydebug } }

execution:

root@elasticsearch-vm:/# /usr/share/logstash/bin/logstash -f "/root/datefailtest.conf" --path.data="/var/lib/logstash-man"
Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
The stdin plugin is now waiting for input:
[2018-03-28 12:00:21] Processed: shop\Export\Jobs\ExportOrders
{
    "@timestamp" => 2018-03-28T10:00:21.000Z,
      "@version" => "1",
       "message" => "[2018-03-28 12:00:21] Processed: shop\\Export\\Jobs\\ExportOrders",
       "process" => "shop\\Export\\Jobs\\ExportOrders",
          "host" => "elasticsearch-vm",
        "status" => "Processed",
       "logtime" => "2018-03-28 12:00:21"
}

Please check your ES configuration again and make sure that this is not the reason for your empty output right now.
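
One quick way to verify on the Elasticsearch side is to list the indices that actually exist (localhost:9200 is a placeholder for your host and port):

curl -XGET 'http://localhost:9200/_cat/indices?v'

If Logstash delivered anything, a logstash-YYYY.MM.dd index will show up there.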

Wow, thanks a lot, now I am getting somewhere. I tried to run it as you suggested, with minor changes:

input {
  file {
    path => "/data/queue/timestamp_test2.log"
    start_position => "beginning"
  }
}

filter {
    grok {
      match => { "message" => "\[%{GREEDYDATA:logtime}\] %{WORD:status}: %{GREEDYDATA:process}" }
    }
    date {
      timezone => "Europe/Prague"
      match => ["logtime", "yyyy-MM-dd HH:mm:ss"]
      target => "@timestamp"
    }
}

output {
    file {
      path => "/tmp/rubydebug_output.txt"
      codec => rubydebug
    }
}

And ran it with this command:

/usr/share/logstash/bin/logstash -f "/etc/logstash/conf.d/logstash-queue.conf"

Then I get the result I wanted, and the file /tmp/rubydebug_output.txt contains:

{
       "message" => "[2018-03-28 12:00:22] Processed: Inspishop\\Export\\JoeAlex\\Jobs\\ExportOrders",
    "@timestamp" => 2018-03-28T10:00:22.000Z,
        "status" => "Processed",
      "@version" => "1",
          "host" => "kibana0",
       "process" => "Inspishop\\Export\\JoeAlex\\Jobs\\ExportOrders",
          "path" => "/data/queue/timestamp_test2.log",
       "logtime" => "2018-03-28 12:00:22"
}

But when I run it as a service (service logstash start), it doesn't do anything.
I suppose I have some other config wrong then.
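
One thing to rule out, assuming the service was running while you also tested on the command line: two Logstash instances cannot share one data directory; the second one fails to acquire the path.data lock. That is presumably why the earlier test command used a separate directory:

/usr/share/logstash/bin/logstash -f "/etc/logstash/conf.d/logstash-queue.conf" --path.data="/var/lib/logstash-man"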

OK, all problems solved. It was all my fault, as it always is. Firstly, it does work as a service; I just had to add the data after I started the service. Then I still couldn't find it in Elasticsearch when I redirected the output to ES. But it was there; I only had to search the index by the old date, not the new one.
Thank you very much for your help, Jenni and Christian.
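
To spell out that last point: once the date filter works, each event is routed to the daily index derived from its parsed @timestamp, not from the day it was ingested. The test lines are from 2018-03-28, so they end up in logstash-2018.03.28, which is why searching logstash-2018.04.23 returned a 404 earlier. For example (host and port are placeholders):

curl -XGET 'http://localhost:9200/logstash-2018.03.28/_search?pretty'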
