I'm new to ELK.
I have many containers from which I want to collect logs using Logstash.
My custom log looks like this:
{'agent_id': 99, 'name': 'Cora', 'timestamp': '2024-08-21 06:26:09', 'operation_name': 'Do_smthing'}
This is my Logstash configuration:
input {
  udp {
    port  => 5044
    codec => "json"
  }
}
filter {
  # Strip the syslog-style prefix (<priority>, timestamp, hostname[pid]:) from the raw line
  grok {
    match => {
      "message" => "<%{NONNEGINT:priority}>%{SYSLOGTIMESTAMP:timestamp} %{DATA:hostname}\[%{POSINT:pid}\]: %{GREEDYDATA:msg}"
    }
    remove_field => ["message"]
  }
  # The application log uses single quotes, so rewrite it into valid JSON
  mutate {
    gsub => ["msg", "'", "\""]
  }
  # Parse the JSON payload into top-level fields
  json {
    source       => "msg"
    remove_field => ["msg"]
  }
  mutate {
    rename => {
      "[agent_id]"       => "@agent_id"
      "[name]"           => "@name"
      "[timestamp]"      => "@json_timestamp"
      "[operation_name]" => "@operation_name"
    }
  }
}
output {
  elasticsearch {
    index       => "logstash-test"
    hosts       => ["https://es01:9200"]
    user        => "elastic"
    password    => "pass"
    ssl_enabled => true
    cacert      => "/usr/share/logstash/certs/ca/ca.crt"
  }
  stdout {}
}
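(For reference, the configuration can be syntax-checked without restarting the pipeline. A minimal sketch, assuming the official Logstash image, a container named logstash, and the pipeline file at /usr/share/logstash/pipeline/logstash.conf — all of these are placeholders for my setup:

# Validate the pipeline syntax only; a separate data path avoids the lock
# held by the already-running Logstash instance
docker exec -it logstash bin/logstash \
  -f /usr/share/logstash/pipeline/logstash.conf \
  --config.test_and_exit \
  --path.data /tmp/logstash-config-test
)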
When I view the logs with docker logs, they appear exactly as I want them. However, after they are collected by Logstash, they look like this:
<30>Aug 21 07:16:51 83c3f7c071fb[45680]: {'agent_id': 99, 'name': 'Cora', 'timestamp': '2024-08-21 06:26:09', 'operation_name': 'Do_smthing'}
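To look at the filter in isolation, I can replay that same line into the UDP input by hand. A minimal sketch, assuming netcat is installed and Logstash is reachable on localhost port 5044 (both placeholders):

# Replay the syslog-wrapped line that Logstash receives into the UDP input
echo "<30>Aug 21 07:16:51 83c3f7c071fb[45680]: {'agent_id': 99, 'name': 'Cora', 'timestamp': '2024-08-21 06:26:09', 'operation_name': 'Do_smthing'}" \
  | nc -u -w1 localhost 5044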
My first question is how to run the Docker container, or configure Logstash correctly, so that I collect exactly what I want.
My second question is how to create custom fields that match my log. In other words, how can I store agent_id in an agent_id field, and so on? I don't want all of them lumped into the message field.
Note: I don't want to read from container log files (there are many containers and I only care about a few specific ones). I want to run each container so that it sends its logs to Logstash directly, for example with a logging driver as sketched below.
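What I have in mind is something along these lines. A minimal sketch, assuming the Logstash host is reachable as logstash on UDP port 5044 and my-app-image stands in for the real image (both are placeholders):

# Send this container's stdout/stderr to Logstash via Docker's syslog logging driver
docker run -d \
  --log-driver syslog \
  --log-opt syslog-address=udp://logstash:5044 \
  --log-opt tag="{{.Name}}" \
  my-app-image

With the syslog driver, each line arrives wrapped in a syslog header, which is exactly what the grok filter above strips off. Docker's gelf driver pointed at a Logstash gelf input would be an alternative that delivers structured events without that wrapper.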