Thanks for the clarification @Badger
I am facing an issue with @timestamp: I expected the @timestamp value to be my local time, but instead I am getting a different value.
Below is the Logstash configuration I am using to send messages to Kafka:
input {
  file {
    add_field => { "[@metadata][type]" => "response_log" }
    path => "/usr/local/tomcat/logs/test.log"
    start_position => "beginning"
  }
}
filter {
  if [@metadata][type] == "response_log" {
    csv {
      columns => ["datetime", "request_ip", "domain", "host", "method", "uri", "response_code", "response_time", "client_id", "payload", "headers", "exception_type", "exception_message", "api_response_code", "api_response_time", "user_id", "api_type", "extra_field"]
      separator => " "
      quote_char => " "
      convert => {
        "response_time" => "integer"
        "api_response_time" => "integer"
        "api_type" => "integer"
      }
    }
    ruby {
      init => 'require "date"'
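      # compute processing_time, and build "ts" (epoch seconds) by parsing "datetime" with a +09:00 offset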
      code => 'event.set("processing_time", event.get("response_time").to_i - event.get("api_response_time").to_i); event.set("ts", DateTime.strptime(event.get("datetime") + "+09:00","%F %T.%L%z").strftime("%s"))'
    }
    mutate {
      convert => {
        "ts" => "integer"
        "processing_time" => "integer"
      }
      remove_field => [
        "path",
        "extra_field"
      ]
    }
  }
}
output {
  <message sent to kafka topic>
}
Sample log line:
2023-01-05 15:27:39.868 601.86.226.128 app.test.co stg-7dcbcb5965-drm GET /api/Travel/VacantRoomCalendar/20140716 401 0 API_REQUEST_ID=efbc5f2c7-4a6b-b66b-f68bc21bab63 x-forwarded-for=10.80.23.72 OAuthError specify valid access token 0 0 0 1
Once Logstash sends the message to Kafka, another Logstash pipeline reads the messages from the Kafka topic and pushes them to Elasticsearch. In that configuration I convert the datetime field to UTC and store it in the target field @datetime:
input {
  <messages are read from kafka topic>
}
filter {
  if [@metadata][kafka][topic] == "test" {
    date {
      timezone => "UTC"
      target => "@datetime"
      match => [ "datetime", "ISO8601" ]
    }
  }
}
output {
  <message index to elasticsearch>
}
But when I checked the index in Elasticsearch, I found that the @timestamp value is the @datetime field value minus 9 hours, stored like this:
"@timestamp": "2023-01-05T06:27:39.000Z"
and below is the @datetime value stored in the same index:
"@datetime": "2023-01-05T15:27:39.868Z"
JSON stored in the Elasticsearch index:
"_source": {
"method": "GET",
"ts": 1672900059,
"exception_type": "OAuthError",
"api_type": 1,
"request_ip": "601.86.226.128",
"@datetime": "2023-01-05T15:27:39.868Z",
"uri": "/api/Travel/VacantRoomCalendar/20140716",
"api_response_code": "0",
"@timestamp": "2023-01-05T06:27:39.000Z",
"processing_time": 0,
"payload": "API_REQUEST_ID=efbc5f2c--f68bc21bab63",
"user_id": "0",
"host": "stg-7dcbcb5965-drm4k",
"datetime": "2023-01-05 15:27:39.868",
"domain": "app.test.com",
"api_response_time": 0,
"@version": "1",
"response_code": "401",
"response_time": 0,
"client_id": null,
"exception_message": "specify valid access token",
"headers": "x-forwarded-for=10.80.23.72"
}
Can you please help me understand why the @timestamp value is being indexed like this? I expected the @timestamp value to be in my local timezone (i.e. the current time on the server).
At the same time, if I remove this line from the ruby code and process it again, the @timestamp value is stored as the current time:
event.set("ts", DateTime.strptime(event.get("datetime") + "+09:00","%F %T.%L%z").strftime("%s"))'
Kindly share your thoughts.
Please correct me If I am doing anything wrong.
Thanks,
Ganeshbabu R