I am trying to create an index for a Squid access log. Filebeat creates the pipeline for Logstash, but due to some error in my grok pattern I am not able to segregate the log into fields. Please help.
A sample of my log file:
551728235.295 866048 10.130.2.58 TCP_TUNNEL/200 8429 CONNECT play.google.com:443 artidesh HIER_DIRECT/172.217.160.206 -
1551728237.817 61121 10.27.2.174 TCP_MISS/200 319 GET http://su.ff.avast.com/R/A28KIGJmYzZlYWM5OGFkNDQzYWI5OWU5MmQ5ZThiY2RmMWVkEgQGBwIZGHgiAf8qBwgEEOS292syCggAELC492sYgAI465eMkAFCINs5qHO9YDIoqtdhpEPo6hYMAcz6Dpa_oZsRDcsK0yfhSICDmBA= sumitk HIER_DIRECT/77.234.45.65 application/octet-stream
1551728238.234 109983 10.26.2.44 TCP_TUNNEL/200 587 CONNECT c.go-mpulse.net:443 harish HIER_DIRECT/104.81.21.87 -
1551728238.997 60974 10.126.2.11 TCP_MISS/200 319 GET http://su.ff.avast.com/R/A3cKIGRkMDc1NTJkYTEwZDRlMGNiZjMwMWJkZDQxY2I2YzUxEgQCAwMZGHgiAf4qBwgEEK-492sqBwgDEKDT8GYyCggEEK-492sYgAo4spKckAFCIAWim9dpoLM1oNFpiMFNau5nnsWSKWVZZP3OPYtUxXoRSICDKA== mukund HIER_DIRECT/77.234.41.236 application/octet-stream
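
These lines follow Squid's native access.log layout. For the third sample line, the fields I am hoping to end up with look roughly like this (field names come from my grok pattern below; the breakdown is just my expectation of what a correct match would produce):

{
    "timestamp"       => "1551728238",
    "request_msec"    => 109983.0,
    "src_ip"          => "10.26.2.44",
    "cache_result"    => "TCP_TUNNEL",
    "response_status" => 200,
    "response_size"   => 587,
    "http_method"     => "CONNECT",
    "dst_host"        => "c.go-mpulse.net",
    "port"            => "443",
    "cache_user"      => "harish",
    "request_route"   => "HIER_DIRECT",
    "forwarded_to"    => "104.81.21.87",
    "content_type"    => "-"
}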
My Logstash pipeline configuration, including the grok filter, looks like:
input {
  beats {
    port => "5044"
  }
}

filter {
  grok {
    match => { "message" => "%{INT:timestamp}.%{INT}\s*%{NUMBER:request_msec:float} %{IPORHOST:src_ip} %{WORD:cache_result}/%{NUMBER:response_status:int} %{NUMBER:response_size:int} %{WORD:http_method} (%{URIPROTO:http_proto}://)?%{IPORHOST:dst_host}(?::%{POSINT:port})?(?:%{DATA:uri_param})? %{USERNAME:cache_user} %{WORD:request_route}/(%{IPORHOST:forwarded_to}|-) %{GREEDYDATA:content_type}" }
  }
  geoip {
    source => "dst_ip"
  }
}

output {
  elasticsearch {
    hosts => ["10.11.109.7:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY-MM}"
    document_type => "%{[@metadata][type]}"
  }
}
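
For reference, I have also been considering a simplified pattern along the lines below, which follows Squid's native access.log layout (time, elapsed ms, client, code/status, bytes, method, URL, user, hierarchy/peer host, type) and keeps the whole URL in a single request_url field instead of splitting host, port, and path. The request_url name is my own choice, and this is only a rough sketch, not something I have working:

filter {
  grok {
    # Squid native access.log: time elapsed client code/status bytes method URL user hierarchy/host type
    match => { "message" => "%{NUMBER:timestamp}\s+%{NUMBER:request_msec:float} %{IPORHOST:src_ip} %{WORD:cache_result}/%{NUMBER:response_status:int} %{NUMBER:response_size:int} %{WORD:http_method} %{NOTSPACE:request_url} %{NOTSPACE:cache_user} %{WORD:request_route}/(?:%{IPORHOST:forwarded_to}|-) %{NOTSPACE:content_type}" }
  }
}

I am also unsure about the geoip filter: it reads source => "dst_ip", but my grok pattern never creates a field with that name (it captures dst_host instead).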