Hi!
I have a question about the correct Logstash setup for shipping plain-text log files to a Kafka cluster in JSON format.
In my logstash.conf I have:
input {
  file {
    path => "/app/server/default/logs/audit.log"
    codec => "json"
    tags => ["server_one_tag1"]
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:EventTime}%{SPACE}%{WORD:TimeZone}%{SPACE}(%{WORD:MWSLogType}:%{WORD:Severity})%{SPACE}%{SPACE}%-%{SPACE}%{GREEDYDATA:LogMessage}" }
    add_tag => ["server_one_tag2"]
  }
}
output {
  kafka {
    bootstrap_servers => "kafka.clusters.com:9092"
    topic_id => ["XYZPREPROD"]
  }
}
A line from the log file that Logstash picks up looks like this:
2018-03-08 19:21:01 CET (Audit:INFO) - DeleteEvent; timestamp=1520527979786; username=system/someuser; operation=Item deleted; status=success; relation_is_from=true; is_deliverable=true; remove=/meta/default/task/03000027472; deleted=/meta/default/task/00000327472
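As far as I can tell, my grok pattern can never match a line like this: the parentheses around (Audit:INFO) are regex metacharacters and would need escaping, and the %{SPACE}%{SPACE}%-%{SPACE} part doesn't correspond to the single " - " separator in the log. That would also explain the _grokparsefailure tag in the debug output below. My best guess at a corrected pattern (untested, same field names as in my config):

filter {
  grok {
    # \( and \) match the literal parentheses; the separator is a plain "-"
    match => { "message" => "%{TIMESTAMP_ISO8601:EventTime}%{SPACE}%{WORD:TimeZone}%{SPACE}\(%{WORD:MWSLogType}:%{WORD:Severity}\)%{SPACE}-%{SPACE}%{GREEDYDATA:LogMessage}" }
    add_tag => ["server_one_tag2"]
  }
}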
After I start Logstash in debug mode, I get the following output:
[2018-07-03T10:19:13,101][DEBUG][logstash.inputs.file ] each: file grew: /app/server/default/logs/audit.log: old size 10292413, new size 10292575
[2018-07-03T10:19:13,105][DEBUG][logstash.inputs.file ] Received line {:path=>"/app/server/default/logs/audit.log", :text=>"2018-07-03 10:19:12 CEST (Audit:INFO) - LoginFailedEvent; ip-address=12.34.56.789; timestamp=1530605952100; failed_login_user=Administrator; action=Failed Login"}
[2018-07-03T10:19:13,122][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: (String)"2018-07-03 10:19:12 CEST (Audit:INFO) - LoginFailedEvent; ip-address=12.34.56.789; timestamp=1530605952100; failed_login_user=Administrator; action=Failed Login"; line: 1, column: 6]>, :data=>"2018-07-03 10:19:12 CEST (Audit:INFO) - LoginFailedEvent; ip-address=12.34.56.789; timestamp=1530605952100; failed_login_user=Administrator; action=Failed Login"}
[2018-07-03T10:19:13,124][DEBUG][logstash.util.decorators ] inputs/LogStash::Inputs::File: adding tag {"tag"=>"server_one_tag1"}
[2018-07-03T10:19:13,125][DEBUG][logstash.inputs.file ] writing sincedb (delta since last write = 83)
[2018-07-03T10:19:13,240][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "host"=>"prisma", "message"=>"2018-07-03 10:19:12 CEST (Audit:INFO) - LoginFailedEvent; ip-address=12.34.56.789; timestamp=1530605952100; failed_login_user=Administrator; action=Failed Login", "@timestamp"=>2018-07-03T08:19:13.123Z, "tags"=>["_jsonparsefailure", "server_one_tag1"], "path"=>"/app/server/default/logs/audit.log"}}
[2018-07-03T10:19:13,242][DEBUG][logstash.filters.grok ] Running grok filter {:event=>#LogStash::Event:0x5104ad89}
[2018-07-03T10:19:13,243][DEBUG][logstash.filters.grok ] Event now: {:event=>#LogStash::Event:0x5104ad89}
[2018-07-03T10:19:13,244][DEBUG][logstash.pipeline ] output received {"event"=>{"@version"=>"1", "host"=>"prisma", "message"=>"2018-07-03 10:19:12 CEST (Audit:INFO) - LoginFailedEvent; ip-address=12.34.56.789; timestamp=1530605952100; failed_login_user=Administrator; action=Failed Login", "@timestamp"=>2018-07-03T08:19:13.123Z, "tags"=>["_jsonparsefailure", "server_one_tag1", "_grokparsefailure"], "path"=>"/app/server/default/logs/audit.log"}}
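If I read the JSON parse error above correctly, codec => "json" on the file input makes Logstash try to parse every incoming line as a JSON document, which these plain-text audit lines can never be; hence the _jsonparsefailure. Since I only need JSON on the way out to Kafka, I'm guessing the codec belongs on the output instead (I also dropped the brackets around the topic name, since topic_id seems to expect a plain string). Something like this, which I have not verified:

input {
  file {
    path => "/app/server/default/logs/audit.log"
    # no json codec here: the audit log is plain text, so the default "plain" codec reads it line by line
    tags => ["server_one_tag1"]
  }
}
output {
  kafka {
    bootstrap_servers => "kafka.clusters.com:9092"
    topic_id => "XYZPREPROD"
    # serialize the whole event (fields and tags included) as JSON before sending to Kafka
    codec => "json"
  }
}

Is that the right direction?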
In Kibana the event looks the same as in the debug output above: the whole line ends up in the message field, tagged with _jsonparsefailure and _grokparsefailure instead of the separate fields.
So, what I want to achieve:
In Kibana, instead of "_jsonparsefailure", I want to see the fields split out the way I defined them in the grok section. I also want to add some tags, like "cluster1", because several applications will write to the same Kafka topic. But however I add the tags, they never appear in Kibana.
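Since the interesting part of each line is a list of key=value pairs, my idea was to chain a kv filter after grok and add the cluster tag with mutate. This is only a sketch of what I have in mind (the "cluster1" tag and the kv options are my own guess, not something I have verified end to end):

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:EventTime}%{SPACE}%{WORD:TimeZone}%{SPACE}\(%{WORD:MWSLogType}:%{WORD:Severity}\)%{SPACE}-%{SPACE}%{GREEDYDATA:LogMessage}" }
    add_tag => ["server_one_tag2"]
  }
  # split "timestamp=...; username=...; ..." from LogMessage into individual event fields
  kv {
    source => "LogMessage"
    field_split => ";"
    value_split => "="
    trim_key => " "
  }
  mutate {
    add_tag => ["cluster1"]
  }
}

Would that give me the separate fields in Kibana, or is there something about how the tags travel through Kafka that I'm missing?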