Save custom logs with Logstash

I'm new to ELK.

I have many containers from which I want to collect logs using Logstash.

My custom log looks like this:

{'agent_id': 99, 'name': 'Cora', 'timestamp': '2024-08-21 06:26:09', 'operation_name': 'Do_smthing'}

This is my Logstash configuration:

input {
    udp {
        port => 5044
        codec => "json"
    }
}

filter {
  grok {
    match => {
      "message" => "<%{NONNEGINT:priority}>%{SYSLOGTIMESTAMP:timestamp} %{DATA:hostname}\[%{POSINT:pid}\]: %{GREEDYDATA:msg}"
    }
    remove_field => ["message"]
  }
  mutate { 
    gsub => ["msg", "'", "\""]
  }  
  json {
    source => "msg"
    remove_field => ["msg"]
  }
  mutate { 
    rename => {"[agent_id]" => "@agent_id"  "[name]" => "@name"  "[timestamp]" => "@json_timestamp" "[operation_name]" => "@operation_name"}
  }
}
  
output {
    elasticsearch {
        index => "logstash-test"
        hosts => ["https://es01:9200"]
        user => "elastic"
        password => "pass"
        ssl_enabled => true
        cacert => "/usr/share/logstash/certs/ca/ca.crt"
    }

    stdout {}
}

When I view logs with docker logs, they appear correct and as I want them to be. However, after collecting them with Logstash, they look like this:

<30>Aug 21 07:16:51 83c3f7c071fb[45680]: {'agent_id': 99, 'name': 'Cora', 'timestamp': '2024-08-21 06:26:09', 'operation_name': 'Do_smthing'}

My first question is how to run the Docker container, or configure Logstash, so that the logs are collected the way I want.

My second question is how to create custom fields identical to my log. In other words, how can I save agent_id in an agent_id field, and so on? I don't want all of them buried in the message field.

Note: I don't want to read from container log files (there are many containers and I only care about a few specific ones). I want to run the container so that it sends its logs to Logstash directly.

That's fine, but you have told Docker to log using syslog. If you do that then you need to configure Logstash to receive syslog messages.
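For context, sending a container's output to Logstash this way uses Docker's syslog logging driver. A sketch of the invocation (the image name and Logstash address are placeholders, not taken from the question):

```shell
# Hypothetical invocation: image name and address are placeholders.
# --log-driver/--log-opt are Docker's standard logging-driver options.
docker run \
  --log-driver syslog \
  --log-opt syslog-address=udp://logstash-host:5044 \
  my-agent-image
```

This driver wraps every line in syslog framing (priority, timestamp, hostname, pid), which is exactly the prefix visible in the output you pasted.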

You cannot use a json codec on the input because at that point there is syslog framing around the JSON, and, as you noticed yourself, the use of single quotes means it is not valid JSON.
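Those single quotes typically come from printing a Python dict with str() instead of serializing it with json.dumps(). A small standalone Python sketch of both the failure and the quote-swap workaround the gsub filter performs:

```python
import json

# A log line shaped like the question's: a Python dict repr, not JSON.
raw = "{'agent_id': 99, 'name': 'Cora'}"

# A strict JSON parser rejects single-quoted keys/strings.
try:
    json.loads(raw)
except json.JSONDecodeError as err:
    print("rejected:", err.msg)

# The same naive quote swap the mutate/gsub filter performs:
fixed = raw.replace("'", '"')
print(json.loads(fixed)["agent_id"])  # prints 99
```

Note that this blanket replacement breaks if a field value itself contains an apostrophe; having the application emit real JSON (double quotes) would be more robust than patching it up in the filter.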

You can use a syslog input which will do the grok for you (and parse the resulting fields).

input { syslog { port => 5044 } }
filter {
    mutate { gsub => [ "message", "'", '"' ] }
    json { source => "message" }
}

will result in

       "process" => {
    "name" => "83c3f7c071fb",
     "pid" => 45680
},
"operation_name" => "Do_smthing",
       "service" => {
    "type" => "system"
},
          "name" => "Cora",
      "agent_id" => 99,
           "log" => {
    "syslog" => {
        "facility" => {
            "code" => 3,
            "name" => "system"
        },
        "severity" => {
            "code" => 6,
            "name" => "Informational"
        },
        "priority" => 30
    }
},
     "timestamp" => "2024-08-21 06:26:09",
          "host" => {
    "ip" => "127.0.0.1"
}