Timestamp field issue

Hi there,
I'm using S3 as an input to read logs into Logstash. The @timestamp field generated in Elasticsearch is the time when the logs are read from S3, not the timestamp field inside the JSON log file.
Below is one JSON object from the log file read from S3:

{"name":"elk","hostname":"DESKTOP","pid":7440,"level":30,"shortName":"testin","data":{"value":"EN","message":"SET Default language as EN"},"timestamp":"2022-04-27 17:42:51.487","msg":"Default language","v":0}

Please help me assign the correct timestamp field.

You can use the date filter plugin for this. It lets you set a target in the configuration, and that target can be the @timestamp field.

To give an example of how it works, I'll send 2022-04-27 17:42:51.487 on its own as the message.

You can use the following filter:

filter {
    date{
      match => [ "message", "yyyy-MM-dd HH:mm:ss.SSS" ]
      target => "@timestamp"
    }
} 

Checking the output, we can see that @timestamp matches the value in your timestamp field:

{
          "host" => "elastic",
    "@timestamp" => 2022-04-27T17:42:51.487Z,
       "message" => "2022-04-27 17:42:51.487",
          "path" => "/var/log/time4.log",
      "@version" => "1"
}

Check the date filter documentation. You can also add the timezone as another option in the filter.
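For your actual log file, a possible filter would be a sketch like the following (assuming each line of the S3 object is one JSON document arriving in the message field, and that your field is named timestamp as in your sample): first parse the JSON, then match the date on that field.

filter {
  # Parse the raw JSON line into individual event fields
  json {
    source => "message"
  }
  # Use the parsed timestamp field to set @timestamp
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
}

Alternatively, setting codec => "json" on the S3 input would parse the JSON for you, so only the date filter would be needed.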

Thanks a lot for your help. This solution worked.