Parsing the logs

Hi, I went to a DevOpsDays event and saw the ELK stack for the first time. I've installed it locally to test it out. It's working with Filebeat => Elasticsearch => Kibana.

I want to parse the message because it has the useful information I need. It looks like this:
2017-10-24 17:20:07,581 [19] INFO TestFunctionService START isFlagTrue

I'd love to have the message "split up" into available fields like this:
{TIMESTAMP} {THREAD} {ERROR_TYPE} {FUNCTION_TYPE} {MESSAGE}

How would I go about doing this correctly?

You can do this through an ingest pipeline in Elasticsearch or by introducing Logstash into your pipeline.
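
For example, an ingest pipeline with a grok processor along these lines should split that message (just a sketch - the pipeline name and field names are placeholders, so adjust them and the pattern to your exact format). You can create it from the Kibana Dev Tools console:

PUT _ingest/pipeline/parse-app-log
{
  "description": "Example pipeline: split the application log line into separate fields",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIMESTAMP_ISO8601:log_timestamp} \\[%{INT:thread}\\] %{LOGLEVEL:log_level} %{NOTSPACE:logger} %{GREEDYDATA:log_message}"
        ]
      }
    }
  ]
}

The captures correspond to your {TIMESTAMP} {THREAD} {ERROR_TYPE} {FUNCTION_TYPE} {MESSAGE} split.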


Thanks Christian, I will look into the ingest pipeline now.

I was hoping to keep things simple and thought that adding Logstash would introduce unnecessary complexity. From the diagram on the ELK homepage it looked like I could use either Beats or Logstash, and that Beats was lighter...

If I fail with the ingest pipeline I will install Logstash.

Ingest pipelines have a limited set of processors, but do support grok, so you should be fine.
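
If you go that route, Filebeat can send events through the pipeline directly via the pipeline setting under output.elasticsearch. A minimal filebeat.yml sketch, assuming the pipeline above is called parse-app-log and Elasticsearch runs locally (the paths are placeholders, and older Filebeat versions use filebeat.prospectors instead of filebeat.inputs):

filebeat.inputs:
  - type: log
    paths:
      - /path/to/your/app/*.log    # placeholder - point this at your log files

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: parse-app-log          # ingest pipeline applied to each event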


Thanks Christian. So, I've managed to use grok to format the messages I have into something that outputs well (using the debugger - I've yet to integrate it). In the process I've noticed that my logging is all over the place. Is there a good standard for writing the log files in the first place? For example, is there a log4net helper function that takes multiple parameters? I will probably end up creating one; I was just wondering whether a standard method with a matching standard ingest pipeline already exists.

I'm really struggling with this. It seems so simple, but there aren't any good examples.
I have a log.txt which now looks like this sample:
2017-11-30 11:06:25,933 [20] INFO DAL.DBManager "FunctionName":"Connect"
2017-11-30 11:06:25,940 [20] INFO DAL.DBManager "FunctionName":"ExecuteDataReader", "Query":"pkg_general.get_number_of_players"
2017-11-30 11:06:29,704 [20] INFO DAL.DBManager "FunctionName":"Dispose"

Currently in Kibana the whole log line comes up under message, but I need to break the message out into:
LogLevel, Class, FunctionName, Query

I don't know what file to modify. I've gone through a few grok examples and can get the output I want; I just don't know where to implement it.

One normally uses Logstash or an Elasticsearch ingest node to apply the grok parsing.
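
For example, a minimal Logstash pipeline along these lines (port, hosts, and field names are just placeholders, and the kv option names assume a reasonably recent version of the kv filter) would receive events from Beats, grok the fixed part of the line, and pull the "FunctionName"/"Query" pairs out of the remainder:

input {
  beats {
    port => 5044
  }
}

filter {
  # Parse the fixed prefix: timestamp, thread id, log level, class
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} \[%{INT:thread}\] %{LOGLEVEL:log_level} %{NOTSPACE:class} %{GREEDYDATA:kvpairs}" }
  }
  # Turn the trailing "FunctionName":"...", "Query":"..." part into separate fields
  kv {
    source      => "kvpairs"
    field_split => ","
    value_split => ":"
    trim_key    => '" '
    trim_value  => '" '
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}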

Hi Steffen, I will remove Filebeat and restart with Logstash. Thanks for the reply.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.