Need grok filter for my custom log files

I have files in a directory where each file follows a different log pattern. I have successfully configured Beats with Elasticsearch, but I am not able to filter fields or data.
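From what I have read, grok does not run inside Beats itself, so I am assuming I first have to point Filebeat at Logstash and join each multi-line request/response block into a single event there. This is only a sketch of what I think the Filebeat side should look like; the log path, the multiline pattern, and the port are guesses on my part:

```
filebeat.inputs:
- type: log
  paths:
    - /var/log/mylogs/*.log            # placeholder path to the directory with the custom logs
  # Start a new event whenever a line begins with a request method or a status line;
  # every other line is appended to the previous event.
  multiline.pattern: '^(GET|POST|PUT|DELETE|HEAD|OPTIONS|HTTP/)'
  multiline.negate: true
  multiline.match: after

# Send to Logstash instead of directly to Elasticsearch, so grok can run there.
output.logstash:
  hosts: ["localhost:5044"]            # placeholder host/port
```

(Depending on the Filebeat version, this section may be called `filebeat.prospectors` instead of `filebeat.inputs`.)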

The data in one of the files looks like this:
"GET / HTTP/1.1
Host: yahoo.com
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:59.0) Gecko/20100101 Firefox/59.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Cookie: B=fh4crrpd9sf94&b=3&s=e1; ucs=lnct=1525329090; HP=1
Connection: keep-alive
Upgrade-Insecure-Requests: 1

HTTP/1.1 301 Moved Permanently
Date: Thu, 07 May 2017 09:48:52 GMT
Connection: keep-alive
Via: http/1.1 media-router-fp1012.prod.media.gq1.yahoo.com (ApacheTrafficServer [c s f ])
Server: ATS
Cache-Control: no-store, no-cache
Content-Type: text/html
Content-Language: en
X-Frame-Options: SAMEORIGIN
Strict-Transport-Security: max-age=2592000
Location: https://www.yahoo.com/
Content-Length: 8"

The format in the other files is different again. Please help me parse these logs.
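On the Logstash side, this is the kind of pipeline I think I need for the sample above, but the patterns and names are only my own guesses (the first grok pattern is meant for request lines like "GET / HTTP/1.1", the second for status lines like "HTTP/1.1 301 Moved Permanently"):

```
input {
  beats {
    port => 5044                        # must match the port in filebeat.yml
  }
}

filter {
  grok {
    match => {
      "message" => [
        "^%{WORD:http_method} %{NOTSPACE:request_path} HTTP/%{NUMBER:http_version}",
        "^HTTP/%{NUMBER:http_version} %{NUMBER:response_code:int} %{GREEDYDATA:response_reason}"
      ]
    }
  }

  # Split the remaining "Header: value" lines into their own fields.
  # For "\n" to be treated as a newline, config.support_escapes: true
  # may need to be set in logstash.yml.
  kv {
    source      => "message"
    field_split => "\n"
    value_split => ":"
    trim_key    => " "
    trim_value  => " "
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "customlogs-%{+YYYY.MM.dd}"   # made-up index name
  }
}
```

If this is roughly right, I guess I would still need a separate grok pattern (or a conditional on the file path) for each of the other files, since their layouts are different.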
