Load Epoch time

Hi,

I'm trying to convert epoch time to a human-readable date.
In my JSON file I tried the field value both with quotes ("") and without.
I saw some answers about this on the web, but in my case it doesn't work.
The field I want to convert is startTime.
This is my conf file:

input {
  file {
    path => ["/tmp_31.json"]
    type => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => [ 'message', '(?"TestName":.*"Agent":"[^"]+")' ]
  }
  date {
    match => [ "startTime", "UNIX" ]
  }
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    host => "xx.xxx.xx.xx"
    protocol => "http"
    index => "index_client"
  }
}

BR,
Chen

match => [ 'message', '(?"TestName":.*"Agent":"[^"]+")' ]

Don't use a grok filter to parse JSON. Use the json codec or filter. Or what's the point of this filter? You're not extracting any fields from the source string so it seems pretty pointless.
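For the codec approach, a minimal sketch of the same file input, assuming one JSON object per line as in the original configuration:

```conf
input {
  file {
    path => ["/tmp_31.json"]
    # Parse each line as JSON at the input stage, so no grok or json filter is needed
    codec => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```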

match => [ "startTime", "UNIX" ]

There is no startTime field when this filter runs. Try placing this filter after the json filter instead.
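A minimal sketch of that reordering, keeping the field names and the UNIX pattern from the original configuration:

```conf
filter {
  # Parse the JSON payload first so startTime exists as an event field
  json {
    source => "message"
  }
  # Only now can the date filter find and parse startTime
  date {
    match => [ "startTime", "UNIX" ]
  }
}
```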

I removed the grok filter and added the following to the filter:
match => [ "startTime", "UNIX" ]
This is the conf file filter after I changed it:
filter {
  date {
    match => [ "startTime", "UNIX" ]
  }
  json {
    source => "message"
  }
}

But it still doesn't convert the startTime.
Does it matter whether the value of startTime is a string or a number in the JSON file?

But it still doesn't convert the startTime.

Logstash won't replace the startTime value. It'll populate the @timestamp field. If that's not what you want you need to set the target option for the date filter.

If you need further help you need to show us what the events look like, preferably by showing the output of a stdout { codec => rubydebug } output.
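For illustration, a sketch of the target option, assuming you want the parsed date written back into startTime itself rather than into @timestamp:

```conf
filter {
  json {
    source => "message"
  }
  date {
    match => [ "startTime", "UNIX" ]
    # Without target, the parsed date goes to @timestamp;
    # this overwrites the startTime field in place instead
    target => "startTime"
  }
}
```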

Does it matter whether the value of startTime is a string or a number in the JSON file?

I'm pretty sure it doesn't matter.

This is my Json file:
{"build_name":"UT","build_number":80,"startTime":1458024571583,"result":"FAILURE","duration":179725}

This is my conf file after I changed it:
input {
  file {
    path => ["/tmp_31.json"]
    type => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  date {
    match => [ "startTime", "UNIX" ]
    target => "@timestamp"
  }
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    host => "xx.xxx.xx.xx"
    protocol => "http"
    index => "index_client"
  }
}

Screenshot after loading the JSON file:

So I'm still doing something wrong ....

I repeat: Put the date filter after the json filter.

Where does the Time field come from? There's nothing in your configuration and the input JSON object that creates it.

Hi,

I also tried putting the date filter after the json filter (with and without target).
I can see the newly loaded index with the new record in the list of Elasticsearch indexes (using curl 'localhost:9200/_cat/indices?v'), but I can't reach the data in Kibana.

filter {
  json {
    source => "message"
  }
}

filter {
  date {
    match => [ "startTime", "UNIX" ]
    target => "@timestamp"
  }
}

The Time field is added automatically as the time the data is loaded into Elasticsearch; I don't define this field in the conf file or in the JSON file (it's also not among the fields of the index_client index I create).
I also tried the following:

  • changing startTime in the JSON file to a string
  • creating a single filter block and putting the date filter after the json filter

It still doesn't work.

BR,
Chen

Your epoch is in milliseconds, so use UNIX_MS instead of UNIX.
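Putting the pieces together, a sketch of the full filter with that fix: 1458024571583 has 13 digits, i.e. an epoch in milliseconds (roughly 2016-03-15 UTC), while the UNIX pattern expects a 10-digit epoch in seconds.

```conf
filter {
  json {
    source => "message"
  }
  date {
    # startTime is a 13-digit epoch in milliseconds, so UNIX_MS, not UNIX
    match => [ "startTime", "UNIX_MS" ]
    target => "@timestamp"
  }
}
```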


Thanks for all your support!
Now it works :slight_smile: