Logstash timestamp - not reading timestamp from log files

Hi guys,

wondering if anybody can help with advice or suggestions on configuring Logstash to properly read the timestamp from custom log files.
I use Logstash to read custom logs and forward them to Elasticsearch, ultimately displaying them in Kibana 4. Everything works well except that the timestamp from the log files is not read properly. When I display @timestamp in Kibana, it shows the time when Logstash read the event, not the actual timestamp from the log when the event occurred. I assume I provided the wrong timestamp format; I'm just not sure how to change it correctly.

Here is an example from a custom log file:

2016-02-19 09:02:45.103: ERROR: 24Failed to read current staging area assets: System.Data.Entity.Core.EntityException: The underlying provider failed on Open. ---> System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No connection could be made because the target machine actively refused it
2016-02-19 09:01:46.103: ERROR: 25Failed to read current staging area assets: System.Data.Entity.Core.EntityException: The underlying provider failed on Open. ---> System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No connection could be made because the target machine actively refused it
2016-02-19 09:01:47.103: ERROR: 26Failed to read current staging area assets: System.Data.Entity.Core.EntityException: The underlying provider failed on Open. ---> System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No connection could be made because the target machine actively refused it

Here is my Logstash configuration:

input {
  file {
    path => "/Logs/*"
    sincedb_path => "/var/lib/logstash/.sincedb_logs"
    start_position => "end"
    type => "customlogs"
    stat_interval => 2
  }
}

filter {
  if [type] == "customlogs" and [message] =~ "ERROR:" {

    grok {
      overwrite => "message"
      match => { "message" => "\A%{TIMESTAMP_ISO8601}: %{DATA:level} %{GREEDYDATA:message}" }
    }
    # Remove the ":" from the level field
    mutate {
      gsub => [ "level", ":", "" ]
    }

  } else {
    drop { }
  }
}

output {
  elasticsearch {
    host => "ELK-VS-100"
    cluster => "ELK100"
  }
  stdout { codec => rubydebug }
}

Any feedback or suggestions on this are very much appreciated.
Thank you very much in advance.

You need to capture the timestamp into a field using your grok pattern, and then use the date filter to populate the @timestamp field based on this field.
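
The date filter part would look something like this (untested; log_timestamp here is just a placeholder for whatever field name you capture the timestamp into):

date {
  match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
  remove_field => [ "log_timestamp" ]
}

The date filter parses the captured string and writes the result to @timestamp, so Kibana will show when the event occurred rather than when Logstash read it.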

Hi,

yes, that's what I thought. I'm wondering if anyone can provide an example of how to capture a date from the log in this format:
2016-02-19 09:01:46.103

Can you please provide an example for this?

Thank you very much!

You're already using the TIMESTAMP_ISO8601 pattern. Just capture that match into a field, the same way you're doing with the level and message fields.
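
In other words, something like this (untested; again, log_timestamp is just an example field name):

grok {
  overwrite => "message"
  match => { "message" => "\A%{TIMESTAMP_ISO8601:log_timestamp}: %{DATA:level} %{GREEDYDATA:message}" }
}
date {
  match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
}

The yyyy-MM-dd HH:mm:ss.SSS pattern matches timestamps like 2016-02-19 09:01:46.103 from your example.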

Appreciate the help. Thanks!