I'm new to the ELK stack and very fond of it, and I want to use it for my application logs. The issue is that I'm not able to get my application logs properly structured.
I'm using Beats to ship the logs to Logstash. It's a multi-line log file (a timestamp marks the start of a new event, and Beats is able to handle that). The log file contains both requests and responses. Because of the multi-line format and the non-standard timestamp, I'm not able to use the grok and date filters together.
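For context, the multiline handling on the Beats side can be expressed roughly like this. This is a sketch assuming a recent Filebeat input syntax; the log path is hypothetical, and the pattern just assumes every new event starts with a bracketed year, as in the sample below:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/radius.log   # hypothetical path
    # Any line that does NOT start with "[2017-" (or another year)
    # is appended to the previous event.
    multiline:
      pattern: '^\[\d{4}-'
      negate: true
      match: after
```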
Sample logs:

[2017-Aug-04 14:41:52.732729] [0x00007f8e8159f700] [level:9] Receive: Request from host 18.104.22.168:49843 code=1, id=22, length=348
    User-Name = "email@example.com"
    Password = "password"
    User-Service-Type = Framed-User
    Framed-Protocol = PPP
    Chargeable-User-Identity = "\00"
    Client-Id = 22.214.171.124
[2017-Aug-04 14:41:52.749042] [0x00007f8e8159f700] [level:9] Sending Code=2, Id=22 to 126.96.36.199.247:49843
    ERX-Ingress-Policy-Name = "1M-upstream"
    ERX-Egress-Policy-Name = "2M-downstream"
    ERX-IPv6-Delegated-Pool-Name = "default-v6"
What I'm trying to do here:
- Extract the timestamp from the request.
- Segregate the requests and responses (using the keyword "Sending" or "Receive" in the first line).
- Capture the value of "Code" appearing in the first line.
If I get help on the above questions, I think I'll be able to derive further rules on my own.
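The three steps above can be sketched as one grok filter with two alternative patterns plus a date filter. This is a minimal sketch assuming the sample lines are representative; field names such as direction, src_ip, and log_timestamp are my own choices, not standard ones. Note the destination in the "Sending" sample line isn't a valid dotted-quad IP, so %{DATA} is used there instead of %{IP}:

```
filter {
  grok {
    # Two layouts: "Receive: ... code=1" vs "Sending Code=2 ...".
    # Grok tries them in order and uses the first that matches.
    match => {
      "message" => [
        "\[%{DATA:log_timestamp}\] \[%{DATA:thread_id}\] \[level:%{INT:log_level}\] Receive: Request from host %{IP:src_ip}:%{INT:src_port} code=%{INT:code}, id=%{INT:id}, length=%{INT:length}",
        "\[%{DATA:log_timestamp}\] \[%{DATA:thread_id}\] \[level:%{INT:log_level}\] Sending Code=%{INT:code}, Id=%{INT:id} to %{DATA:dst_host}:%{INT:dst_port}"
      ]
    }
  }

  # Segregate requests and responses by the keyword in the first line.
  if "Receive:" in [message] {
    mutate { add_field => { "direction" => "request" } }
  } else if "Sending" in [message] {
    mutate { add_field => { "direction" => "response" } }
  }

  # Parse the custom timestamp into @timestamp. Note that @timestamp
  # holds millisecond precision, so the extra microsecond digits are
  # effectively truncated.
  date {
    match => [ "log_timestamp", "yyyy-MMM-dd HH:mm:ss.SSSSSS" ]
  }
}
```

With this in place, each event should carry a parsed timestamp, a direction field ("request" or "response"), and a numeric code field to filter on in Kibana.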