How to parse logs into a nested JSON field and store them in Elasticsearch?

I want to parse AWS Elastic Load Balancer logs into a pattern I can query to find the response time of each request and whether it failed or not: did the request fail with a 404 or 500, or succeed with a 200, something like that.
The log pattern from AWS ELB is:

client:port backend:port request_processing_time backend_processing_time response_processing_time elb_status_code backend_status_code received_bytes sent_bytes

For example, a log line looks like this:

192.168.131.39:2817 10.0.0.1:80 0.000073 0.001048 0.000057 200 200 0 29

I want to parse it into:

{
    "client": {
        "request_client": {
            "full_url": "192.168.131.39:2817",
            "host": "192.168.131.39",
            "port": "2817",
            "status_code": 200,
            "received_byte": 0,
            "request_processing_time ": 0.000073
        },
        "backend":{
            "full_url": "10.0.0.1:80",
            "host": "10.0.0.1",
            "port": "80",
            "status_code": 200,
            "sent_byte": 29,
            "backend_processing_time": 0.001048,
            "response_processing_time": 0.000057
        }

    }
}
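To show what I mean, here is a rough sketch of the mapping I'm after, written as a Python regex with named groups (my understanding is that Logstash's grok filter does the same kind of named-capture matching; the regex and all helper names below are my own illustration, not an existing pattern):

```python
import json
import re

# Hypothetical regex for the field list above:
# client:port backend:port three processing times, two status codes, two byte counts.
ELB_LINE = re.compile(
    r"(?P<client_host>[\d.]+):(?P<client_port>\d+) "
    r"(?P<backend_host>[\d.]+):(?P<backend_port>\d+) "
    r"(?P<request_processing_time>[\d.-]+) "
    r"(?P<backend_processing_time>[\d.-]+) "
    r"(?P<response_processing_time>[\d.-]+) "
    r"(?P<elb_status_code>\d+) (?P<backend_status_code>\d+) "
    r"(?P<received_bytes>\d+) (?P<sent_bytes>\d+)"
)

def parse_elb_line(line):
    """Parse one ELB log line into the nested document shown above."""
    m = ELB_LINE.match(line)
    if m is None:
        return None
    f = m.groupdict()
    return {
        "client": {
            "request_client": {
                "full_url": f"{f['client_host']}:{f['client_port']}",
                "host": f["client_host"],
                "port": f["client_port"],
                "status_code": int(f["elb_status_code"]),
                "received_byte": int(f["received_bytes"]),
                "request_processing_time": float(f["request_processing_time"]),
            },
            "backend": {
                "full_url": f"{f['backend_host']}:{f['backend_port']}",
                "host": f["backend_host"],
                "port": f["backend_port"],
                "status_code": int(f["backend_status_code"]),
                "sent_byte": int(f["sent_bytes"]),
                "backend_processing_time": float(f["backend_processing_time"]),
                "response_processing_time": float(f["response_processing_time"]),
            },
        }
    }

doc = parse_elb_line(
    "192.168.131.39:2817 10.0.0.1:80 0.000073 0.001048 0.000057 200 200 0 29"
)
print(json.dumps(doc, indent=2))
```

The nested document this prints is what I would want indexed into Elasticsearch for each log line.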

Thank you in advance. Or perhaps you can provide a tutorial link where I can read about this? For now I have tried the Grok Debugger; it is quite a mess, but it is my first time. Thank you very much :slight_smile:
