I want to ship my JSON log through Logstash into Elasticsearch and query it from my .NET Core application with NEST / Elasticsearch.Net.
My log file:
{"LogDate:":"2018-01-07T16:17:00.171Z", "ID":"1","LogLevel":"WARNING"}
My Logstash config:
input {
  file {
    path => "C:/Log/ESTest/*.json"
    ignore_older => 86400
    start_position => "beginning"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "log"
    hosts => ["localhost:9200"]
  }
}
The console output shows that the fields were parsed correctly:
{
"path" => "****",
"LogLevel" => "WARNING",
"@version" => "1",
"message" => "{\"LogDate:\":\"2018-01-07T16:17:00.171Z\", \"ID\":\"1\",\"LogLevel\":\"WARNING\"}\r",
"ID" => "1",
"@timestamp" => 2018-03-13T13:44:04.879Z,
"LogDate:" => "2018-01-07T16:17:00.171Z",
"host" => "****"
}
In Kibana I can also see that every field from the JSON string was parsed. In my .NET Core application I create a query to fetch the logs (just an example):
public class log
{
    public DateTimeOffset LogDate { get; set; }
    public int ID { get; set; }
    public string LogLevel { get; set; }
    public Guid GUID { get; set; }
    public string Path { get; set; }
    public string Order { get; set; }
    public string Host { get; set; }
}
var search = client.Search<log>(s => s
    .AllTypes()
    .Index("log")
    .From(0)
    .Size(1000)
);
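To show what I mean, here is a minimal sketch of how I read the results back (names like `client` are from the setup above; the loop itself is just for illustration):

```csharp
// Iterate the hits returned by the search above.
// Assumes `client` is the IElasticClient used in the query.
foreach (var doc in search.Documents)
{
    // ID and LogLevel come back populated, but in my runs LogDate
    // is always left at the DateTimeOffset default value.
    Console.WriteLine($"{doc.LogDate} | {doc.ID} | {doc.LogLevel}");
}
```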
Every value was deserialized except the LogDate field. LogDate stays at its default value, and after changing the property's type the effect is the same:
LogDate {01.01.0001 00:00:00 +00:00} System.DateTimeOffset
LogDate {01.01.0001 00:00:00} System.DateTime (type was changed)