Can someone please help me with a Grok expression to forward Nginx access logs to Elasticsearch? Filebeat is my first preference; if Filebeat doesn't support filtering, then through Logstash to Elasticsearch using the Grok expression below.
Nginx Log:
[2019-07-17T17:51:20+00:00] vip="10.10.10.10" http_method="GET" uri="/errors/error.json" status_code="400" payload_size="0" http_ua="Apache-HttpClient/4.5.2 (Java/1.8.0_112)" request_time="0.060" response_time="-" env="tt" rid="59ce8cb4-9950" event_action="Created" request_uri="/event/consumer" affectedMarkets="us" version="v1" proxy_url=""
Grok Expression:
^\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:vip} %{DATA:http_method} %{DATA:uri} %{DATA:status_code} %{DATA:payload_size} %{DATA:http_ua} %{DATA:data} %{DATA:request_time} %{DATA:response_time} %{DATA:env} %{DATA:rid} %{DATA:event_action} %{DATA:request_uri} %{DATA:affectedMarkets} %{DATA:version} %{DATA:proxy_url}
Result:
{
  "timestamp": [
    "2019-07-17T17:51:20+00:00"
  ],
  "vip": [
    "vip="10.10.10.10""
  ],
  "http_method": [
    "http_method="GET""
  ],
  "uri": [
    "uri="/errors/error.json""
  ],
  "status_code": [
    "status_code="400""
  ],
  "payload_size": [
    "payload_size="0""
  ],
  "http_ua": [
    "http_ua="Apache-HttpClient/4.5.2"
  ],
  "data": [
    "(Java/1.8.0_112)""
  ],
  "request_time": [
    "request_time="0.060""
  ],
  "response_time": [
    "response_time="-""
  ],
  "env": [
    "env="tt""
  ],
  "rid": [
    "rid="59ce8cb4-9950""
  ],
  "event_action": [
    "event_action="Created""
  ],
  "request_uri": [
    "request_uri="/event/consumer""
  ],
  "affectedMarkets": [
    "affectedMarkets="us""
  ],
  "version": [
    "version="v1""
  ],
  "proxy_url": [
    ""
  ]
}
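As far as I can tell, the duplicated names happen because %{DATA:vip} matches the whole vip="10.10.10.10" token, key included. Here is a quick plain-Python sketch of the split I'm actually after (illustration only, not part of any Beats/Logstash pipeline; the sample line is shortened):

```python
import re

line = ('[2019-07-17T17:51:20+00:00] vip="10.10.10.10" http_method="GET" '
        'uri="/errors/error.json" status_code="400"')

# Grab the bracketed timestamp off the front of the line.
timestamp = re.match(r'^\[([^\]]+)\]', line).group(1)

# Collect every key="value" token into a dict, quotes stripped.
fields = dict(re.findall(r'(\w+)="([^"]*)"', line))

print(timestamp)       # 2019-07-17T17:51:20+00:00
print(fields["vip"])   # 10.10.10.10 (key name no longer repeated in the value)
```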
The log file already contains key=value pairs, so when I apply the Grok expression, each captured value includes the key name again. How can I keep the key name from appearing twice in the result? Is there a better way to write the Grok expression?
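From what I've read, Filebeat can't run Grok itself (it only has a dissect processor, or it can hand off to an Elasticsearch ingest pipeline), so the filtering would happen in Logstash. Since the line is already key="value" pairs, would Logstash's kv filter be a better fit than naming every field in Grok? A minimal sketch of what I have in mind (untested, field names taken from the sample line above):

```
filter {
  # Pull the bracketed timestamp off the front; the rest of the
  # line is kept in kv_pairs for the kv filter to split.
  grok {
    match => { "message" => "^\[%{TIMESTAMP_ISO8601:timestamp}\] %{GREEDYDATA:kv_pairs}" }
  }
  # Split key="value" tokens on whitespace; quoted values should be
  # handled, with trim_value stripping any leftover quotes.
  kv {
    source       => "kv_pairs"
    field_split  => " "
    value_split  => "="
    trim_value   => "\""
    remove_field => ["kv_pairs"]
  }
}
```

The alternative I can think of is spelling the keys out literally in the Grok pattern, e.g. vip="%{DATA:vip}" http_method="%{WORD:http_method}", so only the quoted value is captured, but that means maintaining the pattern by hand whenever a field is added.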