Hi, I've been trying to create an index from JSON log files that are generated by Log4j2 and viewed in Kibana. Everything works fine except that Logstash skips the last object in the log file; the next time the file changes it picks up that object, but then skips the newly added one (the new "last" object).
Here is my conf file:
[code]input {
  file {
    codec => multiline {
      pattern => '^{'
      negate => true
      what => "previous"
    }
    path => "C:/logs/webapp-errors.log"
    #codec => json
    start_position => "beginning"
    sincedb_path => "D:/Development/logstash-2.2.0/logstash-2.2.0/dbfile/null"
  }
}

filter {
  json { source => "message" }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    codec => "json"
  }
  stdout { codec => rubydebug }
}[/code]
and the log file looks like:
{
"timeMillis" : 1456950611518,
"thread" : "http-nio-8080-exec-14",
"level" : "WARN",
"loggerName" : "org.hibernate.util.JDBCExceptionReporter",
"message" : "SQL Error: 0, SQLState: 42601",
"endOfBatch" : false,
"loggerFqcn" : "org.apache.logging.slf4j.Log4jLogger"
}
{
"timeMillis" : 1456950611519,
"thread" : "http-nio-8080-exec-14",
"level" : "ERROR",
"loggerName" : "org.hibernate.util.JDBCExceptionReporter",
"message" : "ERROR: syntax error at or near \")\"\n Position: 7055",
"endOfBatch" : false,
"loggerFqcn" : "org.apache.logging.slf4j.Log4jLogger"
}
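As a quick sanity check (my own Python snippet, not part of the pipeline), the timeMillis values above are plain epoch milliseconds, so converting them always yields UTC datetimes:

```python
# Interpret the timeMillis values from the sample log as epoch
# milliseconds and render them as UTC datetimes.
from datetime import datetime, timezone

for ms in (1456950611518, 1456950611519):
    dt = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
    print(dt.isoformat())
# both fall on 2016-03-02 at 20:30:11 UTC
```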
I haven't tried using timeMillis as @timestamp, since Logstash reads the file at run time and assigns the current timestamp anyway. When I converted timeMillis to a datetime, I realized it's in UTC. Could that be the cause?
Thanks