Prune Plugin error

Hello,

I am trying to use the prune filter to keep only the fields I need (the data is in foo:bar format), but I am unsuccessful and get the error below.
The configuration works without the prune filter. Please advise on how to fix it.

Error:

[ERROR] 2022-08-03 14:52:09.275 [Converge PipelineAction::Reload<main>] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Expected one of [ \\t\\r\\n], \"#\", [A-Za-z0-9_-], '\"', \"'\", [A-Za-z_], \"-\", [0-9], \"[\", \"{\" at line 63, column 6 (byte 1406) after filter {\n\tgrok {\n\t \tmatch => {\"message\" => \"%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname}%{GREEDYDATA:message}\"}\n\t\toverwrite => [\"message\"]\n\t}\n\tmutate {\n\t\tremove_field => [\"@version\",\"@timestamp\",\"event\",\"log\",\"syslog\",\"priority\"]\n\t}\n\tprune {\n\t\t    whitelist_names => [\n\t\t\t\t\t\t\"adf\",\n\t\t\t\t\t\t\"significant\",\n\t\t\t\t\t\t\"udf\",\n\t\t\t\t\t\t\"virtualservice\",\n\t\t\t\t\t\t\"vs_ip\",\n\t\t\t\t\t\t\"client_ip\",\n\t\t\t\t\t\t\"client_src_port\",\n\t\t\t\t\t\t\"client_dest_port\",\n\t\t\t\t\t\t\"start_timestamp\",\n\t\t\t\t\t\t\"report_timestamp\",\n\t\t\t\t\t\t\"total_time\",\n\t\t\t\t\t\t\"connection_ended\",\n\t\t\t\t\t\t\"client_rtt\",\n\t\t\t\t\t\t\"mss\",\n\t\t\t\t\t\t\"rx_bytes\",\n\t\t\t\t\t\t\"tx_bytes\",\n\t\t\t\t\t\t\"rx_pkts\",\n\t\t\t\t\t\t\"tx_pkts\",\n\t\t\t\t\t\t\"out_of_orders\",\n\t\t\t\t\t\t\"retransmits\",\n\t\t\t\t\t\t\"timeouts\",\n\t\t\t\t\t\t\"zero_window_size_events\",\n\t\t\t\t\t\t\"service_engine\",\n\t\t\t\t\t\t\"vcpu_id\",\n\t\t\t\t\t\t\"pool\",\n\t\t\t\t\t\t\"pool_name\",\n\t\t\t\t\t\t\"server_ip\",\n\t\t\t\t\t\t\"server_name\",\n\t\t\t\t\t\t\"server_conn_src_ip\",\n\t\t\t\t\t\t\"server_dest_port\",\n\t\t\t\t\t\t\"server_src_port\",\n\t\t\t\t\t\t\"server_rtt\",\n\t\t\t\t\t\t\"server_total_bytes\",\n\t\t\t\t\t\t\"server_rx_bytes\",\n\t\t\t\t\t\t\"server_tx_bytes\",\n\t\t\t\t\t\t\"server_total_pkts\",\n\t\t\t\t\t\t\"server_rx_pkts\",\n\t\t\t\t\t\t\"server_tx_pkts\",\n\t\t\t\t\t\t\"server_out_of_orders\",\n\t\t\t\t\t\t\"server_retransmits\",\n\t\t\t\t\t\t\"server_timeouts\",\n\t\t\t\t\t\t\"server_zero_window_size_events\",\n\t\t\t\t\t\t\"protocol\",\n\t\t\t\t\t\t\"persistence_used\",\n\t\t\t\t\t\t\"vs_name\",\n\t\t\t\t\t", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:199:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/reload.rb:51:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:381:in `block in converge_state'"]}

Sample event (syslog header followed by a JSON payload):

<38>Jul 29 10:28:49 192.168.153.101 {"adf": false,"significant":0,"udf":false,"virtualservice":"virtualservice-210234d8-3c75-45da-a044-4bd6a34a8df4","vs_ip":"10.239.9.18","client_ip":"128.32.69.16","client_src_port":49864,"client_dest_port":80,"start_timestamp":"2022-07-29T08:28:49.289389Z","report_timestamp":"2022-07-29T08:28:49.337312Z","total_time":47,"connection_ended":true,"client_rtt":0,"mss":1500,"rx_bytes":97,"tx_bytes":3250,"rx_pkts":1,"tx_pkts":3,"out_of_orders":0,"retransmits":0,"timeouts":0,"zero_window_size_events":0,"service_engine":"192-168-153-101","vcpu_id":0,"log_id":2054665,"pool":"pool-eea867f2-9e50-4819-9009-ace148456fee","pool_name":"VS_PROXY_BIZ-pool","server_ip":"153.239.62.204","server_name":"153.239.62.204","server_conn_src_ip":"10.113.255.22","server_dest_port":80,"server_src_port":31528,"server_rtt":0,"server_total_bytes":0,"server_rx_bytes":3250,"server_tx_bytes":97,"server_total_pkts":0,"server_rx_pkts":3,"server_tx_pkts":1,"server_out_of_orders":0,"server_retransmits":0,"server_timeouts":0,"server_zero_window_size_events":0,"protocol":"PROTOCOL_TCP","persistence_used":true,"vs_name":"VS_PROXY_BIZ"}

Configuration file:

input {
        tcp {
                #host => "10.1.0.5"
                port => 1555 # listening port
                #type => syslog
        }
}
filter {
        grok {
                match => {"message" => "%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname}%{GREEDYDATA:message}"}
                overwrite => ["message"]
        }
        mutate {
                remove_field => ["@version","@timestamp","event","log","syslog","priority"]
        }
        prune {
                    whitelist_names => [
                                                "adf",
                                                "significant",
                                                "udf",
                                                "virtualservice",
                                                # there are a lot more fields; I have omitted them
                                           ]
output {
        stdout {}
}

--
Thanks in advance.
Siddarth

I do not see anything in your filters that would parse the [message] field to extract fields like [adf], [significant], etc. In that case the prune filter will delete the [message] field and you may end up with an empty event.

I modified the prune filter:

prune {
        whitelist_names => [
                "message",
                "adf"
        ]
}

The error is gone; however, I now get a single-line output like the one below:

{
    "message" => " {\"adf\": false,\"significant\":0,\"udf\":false,\"virtualservice\":\"virtualservice-210234d8-3c75-45da-a044-4bd6a34a8df4\",......
}

How can I extract the data inside the [message] field, such as [adf], [significant], etc.?
Please advise.

Thanks in advance.
Siddarth

Use a json filter to parse it.
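A minimal sketch of what that could look like, placed after the grok filter and before the prune filter (this assumes the JSON payload ends up in [message] after grok, as in the configuration above):

```
filter {
  json {
    # parse the JSON text held in [message] into top-level event fields
    source => "message"
  }
}
```

Once the JSON is parsed, fields such as [adf] and [significant] exist at the top level of the event and can be matched by the prune filter's whitelist_names.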

@Badger, thank you. Your suggestion worked.
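For future readers, the assembled filter section could look roughly like this (a sketch based on the configuration in the first post, with the whitelist abbreviated; note that the prune block must be closed with `]` and `}` before the output section, which is what the original parse error at "line 63, column 6" was complaining about):

```
filter {
  grok {
    match => { "message" => "%{SYSLOG5424PRI}%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname}%{GREEDYDATA:message}" }
    overwrite => ["message"]
  }
  json {
    # extract adf, significant, udf, ... from the JSON payload
    source => "message"
  }
  mutate {
    remove_field => ["@version","@timestamp","event","log","syslog","priority","message"]
  }
  prune {
    # abbreviated; list every field you want to keep
    whitelist_names => ["adf","significant","udf","virtualservice"]
  }
}
```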