Change default Logstash @timestamp to a timestamp from my logs


(Nithin Nk) #1

Hi, I have a JSON log file which looks like:

{"tenant_id":"e100118","component_job_id":56153,"component_status":"ERROR","system_id":"GBT204","business_type":"Test","application_id":"e100118tmn","error_since_dtm":"22-05-2018 15:17:28 UTC"}

I transferred the logs to Elasticsearch using Logstash and I am able to see the data. But since this is a one-time upload, the timestamp created is the Logstash timestamp, which shows the time when I did the upload. I need the data to carry the timestamp from my log file instead.

For example: when I upload the data, the index is created with @timestamp set to today's date, but I need to replace this @timestamp with the time in error_since_dtm.

Is that possible?


(Nithin Nk) #2

This is how the config file looks.

Input:

input {
  file {
    path => "/opt/json/file.json"
    type => "json" # a type to identify those logs (will need this later)
    start_position => "beginning"
    sincedb_path => "/dev/null"
    exclude => "*.gz"
  }
}

Filter:

filter {
  json {
    source => "message"
  }
  date {
    locale => "en"
    match  => ["error_since_dtm", "dd-MM-YYYY HH:mm:ss"]
    timezone => "UTC"
    target => "@timestamp"
  }
}

Output:

output {
  elasticsearch {
    codec => json
    hosts => ["10.47.43.67:9200"]
    index => "pkpkpkjson"
  }
  stdout { codec => rubydebug }
}


#3

error_since_dtm has a timezone on it, so the match pattern should be:

match  => ["error_since_dtm","dd-MM-YYYY HH:mm:ss ZZZ"]

You have to match the whole field.
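The same whole-field matching behavior can be illustrated with Python's strptime as a rough analogy (Logstash's date filter uses Joda-Time patterns, so the %-style tokens below are just the strptime equivalents, not the actual filter syntax):

```python
from datetime import datetime

# Value of error_since_dtm from the sample log line
log_value = "22-05-2018 15:17:28 UTC"

# A pattern without the timezone token fails: the trailing " UTC" is
# left unmatched, much like the date filter tagging _dateparsefailure.
try:
    datetime.strptime(log_value, "%d-%m-%Y %H:%M:%S")
except ValueError as e:
    print("parse failed:", e)

# Matching the whole field, timezone included, succeeds.
ts = datetime.strptime(log_value, "%d-%m-%Y %H:%M:%S %Z")
print(ts.isoformat())
```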


(Nithin Nk) #4

Super, it worked! I missed that :frowning:


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.