Hi there,
I am trying to ingest data from a log file, filter it with grok, and write the result to another log file.
Here is the input log file content:
"122.164.121.231 - - [23/Nov/2015:07:19:54 -0500] "GET /configs/config/thecelloserenades@gmail.com HTTP/1.1" 200 1135 "http://localhost/octoviz-auth/public/octoviz/admin/admin.html" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:39.0) Gecko/20100101 Firefox/39.0
Here is my Logstash config file:
input {
  file {
    path => "C:/EVRY/evry/projects/workingProjects/ELK-searchEngine/tomcatlog.log"
    start_position => "beginning"
    sincedb_path => "NUL"
    type => "logs"
  }
}
filter {
  grok {
    match => {
      "message" => "%{IP:clientip} - - \[%{NOTSPACE:date} -%{INT}\] \"%{WORD:action} /%{WORD}/%{WORD}/%{NOTSPACE:login} %{WORD:protocol}/%{NUMBER:protocolNum}\" %{NUMBER:status} %{NUMBER} \"%{NOTSPACE}\" \"%{NOTSPACE:client} \(%{WORD}; %{WORD:clientOs}%{GREEDYDATA}"
    }
    add_field => {
      "eventName" => "grok"
    }
  }
  geoip {
    source => "clientip"
  }
}
output {
  file {
    path => "C:/EVRY/evry/projects/workingProjects/ELK-searchEngine/tomcatlogout.log"
  }
  stdout {}
}
This is the output I am looking for in the output log file:
{
"clientip": [
[
"122.164.121.231"
]
],
"IPV6": [
[
null
]
],
"IPV4": [
[
"122.164.121.231"
]
],
"date": [
[
"23/Nov/2015:07:19:54"
]
],
"INT": [
[
"0500"
]
],
"action": [
[
"GET"
]
],
"WORD": [
[
"configs",
"config",
"X11"
]
],
"login": [
[
"thecelloserenades@gmail.com"
]
],
"protocol": [
[
"HTTP"
]
],
"protocolNum": [
[
"1.1"
]
],
"BASE10NUM": [
[
"1.1",
"200",
"1135"
]
],
"status": [
[
"200"
]
],
"NUMBER": [
[
"1135"
]
],
"NOTSPACE": [
[
"http://localhost/octoviz-auth/public/octoviz/admin/admin.html"
]
],
"client": [
[
"Mozilla/5.0"
]
],
"clientOs": [
[
"Ubuntu"
]
],
"GREEDYDATA": [
[
"; Linux x86_64; rv:39.0) Gecko/20100101 Firefox/39.0"
]
]
}
Here is the Logstash console log:
Sending Logstash logs to C:/EVRY/evry/projects/workingProjects/ELK-searchEngine/logstash-6.5.3/logstash-6.5.3/logs which is now configured via log4j2.properties
[2019-01-07T16:35:53,816][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-07T16:35:53,847][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.3"}
[2019-01-07T16:35:58,983][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-01-07T16:35:59,855][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4526df11 run>"}
[2019-01-07T16:35:59,902][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-07T16:35:59,933][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-01-07T16:36:00,261][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
But the output log file is blank; Logstash is not writing anything to it. Please suggest what I am doing wrong. Do I need to install grok or some other plugin in Logstash?
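(I am using the plain Logstash 6.5.3 download and have not installed any extra plugins; I assume the file, grok, and geoip plugins are bundled by default, but I have not confirmed that with bin/logstash-plugin list.)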
Thanks in advance.