I have a long entry in my text file (just a string) whose format I cannot control, and it contains a lot of spaces. Is there a way to remove them before pushing the event to Kafka? I only need a simple grok filter with GREEDYDATA.
My config is:
input {
  file {
    path => "/app/logs/somelong.log"
    type => "my_type_1"
    tags => ["xxx","yyy"]
  }
}
filter {
  grok {
    match => { "message" => "%{GREEDYDATA:LogMessage}" }
    remove_field => ["message"]
  }
}
output {
  kafka {
    codec => "json"
    bootstrap_servers => "my.kafka.net:9092"
    topic_id => ["SOMETOPIC"]
  }
}
At the moment I don't need to split the following string into several fields; I just need to trim all the extra spaces in that log line. Unfortunately, the spaces were collapsed when pasting it here, so I am annotating where they occur:
XYZSESSION<5 spaces here>b43e0d68-6cd9-4299-8cv 99-cefac7b940a0<10 spaces here>b43e0d68-6cd9-4299-8c99-ceyiifac7b940a0<2 spaces here>5865za7a68-1a9a-ea38-a520-46dbe35e0784<3 spaces here>myserver:5195 2018-07-03T22:00:00.31Z<20 spaces here>2018-07-03T22:00:00.31Z
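For reference, one possible approach (a sketch, not tested against this exact log) would be to add a mutate filter before the existing grok. The mutate filter's gsub and strip options are standard; the \s+ pattern and the choice of replacing runs of whitespace with a single space (rather than removing them entirely) are assumptions about what "trim" should mean here:

filter {
  # Normalize whitespace before the existing grok filter runs
  mutate {
    # Replace every run of whitespace in the raw message with a single space;
    # use "" as the replacement instead if the spaces should be removed entirely
    gsub => [ "message", "\s+", " " ]
    # Remove any leading and trailing whitespace from the field
    strip => [ "message" ]
  }
  grok {
    match => { "message" => "%{GREEDYDATA:LogMessage}" }
    remove_field => ["message"]
  }
}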