JSON filter plugin

I want to configure a JSON filter plugin in my ELK stack, but after configuring the following files, I don't see any data in Kibana. I'm not sure if I need to configure any additional files. Can anyone help me?


  • logstash.yml

# The number of events to process in a single batch (default is 125)

pipeline.batch.size: 500

pipeline.batch.delay: 5

# Determines how Logstash buffers events

pipeline.buffer.type: direct # Or 'heap', based on your requirements

pipeline.ordered: false

# Activate the "dead letter queue" (DLQ) feature for handling failed events

dead_letter_queue.enable: true

# Path to the dead letter queue

#dead_letter_queue.path: "/var/lib/logstash/dead_letter_queue"

# Queue settings (optional)

#queue.type: "memory" # Options are "memory" or "persisted"

#queue.memory.size: 1024mb # Only if using memory queue

# Paths for the configuration files

path.config: "/tmp/logstash-test/logstash.conf"

# Path for the log files

path.logs: "/var/log/logstash"

# Enable/disable logging
log.level: "info"

# Options: debug, info, warn, error, fatal


  • logstash.conf


input {
  file {
    path => "/tmp/test.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json
  }
}

filter {
  json {
    source => "message"
    target => "json_data"
    # This will store the parsed JSON under the "json_data" field
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstash-logs-%{+YYYY.MM.dd}"
  }
}


  • test.log

["test": "data"}


After running the following command, the terminal freezes, but there are no errors, only warnings.



j-VirtualBox:/tmp/logstash-test$ sudo /usr/share/logstash/bin/logstash -f /tmp/logstash-test/logstash.conf

Using bundled JDK: /usr/share/logstash/jdk

Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console

[WARN ] 2024-12-27 02:45:05.644 [main] runner - Starting from version 9.0, running with superuser privileges is not permitted unless you explicitly set 'allow_superuser' to true, thereby acknowledging the possible security risks

[WARN ] 2024-12-27 02:45:05.655 [main] runner - NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.

[INFO ] 2024-12-27 02:45:05.724 [main] runner - Starting Logstash {"logstash.version"=>"8.17.0", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6150e OpenJDK 64-Bit Server VM 21.0.5+11-LTS on 21.0.5+11-LTS +indy +jit [x86_64-linux]"}

[INFO ] 2024-12-27 02:45:05.799 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]

[INFO ] 2024-12-27 02:45:06.030 [main] StreamReadConstraintsUtil - Jackson default value override 'logstash.jackson.stream-read-constraints.max-string-length' configured to '200000000'

[INFO ] 2024-12-27 02:45:06.033 [main] StreamReadConstraintsUtil - Jackson default value override 'logstash.jackson.stream-read-constraints.max-number-length' configured to '10000'

[WARN ] 2024-12-27 02:45:06.275 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified

[INFO ] 2024-12-27 02:45:07.836 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9602, :ssl_enabled=>false}


A few questions:

  1. I see that you enabled the DLQ but are not processing it in another pipeline; any reason why? (It might be that failed events get sent to a DLQ that nothing is reading; see the sketch after this list.)
  2. The test log you are sending as an event sets the value "data" on the field "test"; is this correct? (I think the opening [ is a typo and the line should read {"test": "data"}, but check the JSON syntax just in case.)
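
On point 1: if you keep the DLQ enabled, a second pipeline would normally consume it. Here is a minimal sketch using the dead_letter_queue input plugin; the path, pipeline id, and index name are assumptions, so adjust them to your setup:

input {
  dead_letter_queue {
    # Must match path.dead_letter_queue (default: <path.data>/dead_letter_queue)
    path => "/var/lib/logstash/dead_letter_queue"
    # Id of the pipeline whose failed events should be read ("main" is assumed)
    pipeline_id => "main"
    commit_offsets => true
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "dlq-events-%{+YYYY.MM.dd}"
  }
}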

The json filter is trying to find the field "message" and parse the JSON within that field. Example for test.log:

{
  "message": "{\n  \"test\": \"data\"\n}"
}
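
With your filter settings (source => "message", target => "json_data"), such an event would come out roughly like this; this is a sketch of the resulting field layout, and the original message field is kept by default:

{
  "message": "{\n  \"test\": \"data\"\n}",
  "json_data": {
    "test": "data"
  }
}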

By using the json codec you have already told Logstash to expect the file to consist of JSON, so the filter is redundant. Only when, as in the example above, a JSON string is nested inside a JSON object would the filter be useful.
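
Concretely, for a file where each line is itself a JSON object, a pipeline like the following (paths and index name taken from your post) should be enough on its own, without the json filter:

input {
  file {
    path => "/tmp/test.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json  # each line is already parsed as JSON here, so no json filter is needed
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstash-logs-%{+YYYY.MM.dd}"
  }
}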

Hope this helps!