Can you create a standard log entry that Logstash will automatically parse?

I'm creating new logs (using log4net) and I was wondering: is there a standard format for log entries so that Kibana can discover the available fields automatically?

I just spotted this in another post:
kv {
  value_split => ":"
  field_split => ","
}
Would that be the answer? It looks like a key-value split... perhaps it's custom?

To minimize the amount of processing in Logstash I suggest you produce JSON log files (one JSON object per line).


Hi Magnus, I really appreciate the feedback. Thank you. I've added JSON to my logs (using log4net.Ext.Json) and am getting back this:

{"date":"2017-12-15T10:56:32.4665176+00:00","level":"DEBUG","appname":"Default Web Site","logger":"Project1.Class1","thread":"27","ndc":"(null)","message":"\"Function\":\"someFunction\", \"QID\":\"123\",\"ItemId\":\"193870\", \"UserId\":\"61699\", \"Destination\":\"somePage\", \"Message\":\"Debugging someFunction ActionResult\""}
{"date":"2017-12-15T10:56:34.2633558+00:00","level":"INFO","appname":"Default Web Site","logger":"Project1.Class2","thread":"27","ndc":"(null)","message":"\"Function\":\"dbConnect\""}

It looks OK, but everything is inside the message field. I'll have to find out if I can split that up. Any ideas?

Where is the message string constructed? In the application itself or the logging stack? Ideally we'd be able to avoid serializing it in the first place, but if that's not possible a kv filter is probably the best bet.
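For reference, a kv filter matching the message format in the sample above might look like this. This is only a sketch; the trim_key/trim_value settings are assumptions based on the escaped quotes and spaces visible in the sample message:

filter {
  kv {
    source => "message"
    field_split => ","
    value_split => ":"
    # strip the surrounding double quotes and stray spaces from keys and values
    trim_key => "\" "
    trim_value => "\""
  }
}

With that in place, pairs like "Function":"someFunction" should become separate fields on the event.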

I've taken another step forward, I'll add the solution here in case anyone else goes on the same path as me.

Say I want to add "Function" as a field in Kibana. My web.config would contain a layout like this:

<layout type="log4net.Layout.SerializedLayout, log4net.Ext.Json">
    <decorator type="log4net.Layout.Decorators.StandardTypesDecorator, log4net.Ext.Json" />
    <default />
    <member value="Function" />
</layout>

Then in the code where I create the log I would have this:

log4net.LogicalThreadContext.Properties["Function"] = FunctionName;

The final outcome would be this:

{"date":"2017-12-15T11:34:38.1718475+00:00","level":"INFO","appname":"Default Web Site","logger":"DAL.DBManager","thread":"22","ndc":"(null)","message":"","Function":"ExecuteDataReader"}
{"date":"2017-12-15T11:34:38.4541514+00:00","level":"INFO","appname":"Default Web Site","logger":"DAL.DBManager","thread":"22","ndc":"(null)","message":"","Function":"Dispose"}

Magnus, now that I have JSON as my logs, what do I need to do next? Is it as easy as some sort of filter in logstash.conf?

Magnus, now that I have JSON as my logs, what do I need to do next?

Use a file input with codec => json_lines and an elasticsearch output.
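A minimal pipeline along those lines might look like the following (a sketch, using the log path mentioned earlier in the thread; note that the file input expects forward slashes, even on Windows):

input {
  file {
    path => "C:/dev/Source/appCenter/Log/Log.txt"
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Since each line of the log is already a complete JSON object, no grok or kv filters are needed; the json_lines codec decodes the fields directly.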


My logs look OK, but I'm not seeing anything in Kibana. The logstash.conf looks like this:

input {
  file {
    path => "C:/dev/Source/appCenter/Log/Log.txt"
    codec => json_lines
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}

Logstash is tailing the file and waiting for more data to be appended. Please read about sincedb in the file input docs and search the archives.
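For experimenting, the file input can be told to re-read the file from the beginning by overriding where it keeps its sincedb. A sketch (sincedb_path => "NUL" is the Windows counterpart of /dev/null and should only be used while testing, since it discards the read position on every restart):

input {
  file {
    path => "C:/dev/Source/appCenter/Log/Log.txt"
    codec => json_lines
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}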

Damn, I thought I was so close to getting this working. I'll search the docs and archives.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.