[2024-10-14T17:10:42,812][WARN ][logstash.filters.json ][main][a6bb14d6041ac08bc38f9492d932ed1692c2b1db5f3653b9bd32f0a6676ed45f] Error parsing json {:source=>"message", :raw=>"2024-10-14T17:10:42+02:00 bts-test daemon.info patroni 722 - - 2024-10-14 17:10:42,902 INFO: no action. I am (bts-test), a secondary, and following a leader (bts-test2)", :exception=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 6]>}
I don't have a json filter ...
And I am running it as a service. When I run it from the command line it is parsed correctly, but when I change the output to Elasticsearch and Kibana it is not parsed correctly.
The error is coming from a json filter, so you may have a json filter in one of your pipeline files.
What does your pipelines.yml look like? When you run Logstash as a service it uses the pipelines.yml file to run the pipelines; by default it will look for *.conf files inside /etc/logstash/conf.d.
If you have multiple conf files in this path, Logstash will merge all of them into one big pipeline, and every event will pass through every filter and output. So if you have a json filter in any of those files, that may be your issue.
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
# https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
- pipeline.id: main
path.config: "/etc/logstash/conf.d/*.conf"
I managed to unlink the kafka pipelines from conf.d and now my logs are parsed correctly in Kibana, but how can I keep the kafka pipelines in the conf.d directory without this error?
This tells Logstash to create a pipeline named main that merges all *.conf files inside the /etc/logstash/conf.d path.
So, unless you have conditionals in your pipelines, all events from all inputs will pass through all filters and will be sent to all outputs.
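If you do want to keep a single merged pipeline, one option is to wrap the json filter in a conditional so it only runs on the events that are actually JSON. A minimal sketch, assuming the kafka input tags its events with "kafka" (the tag name is hypothetical, adjust it to whatever your input sets):

```
filter {
  # Only parse events that came from the kafka input;
  # syslog-style lines like the patroni message will skip this filter.
  if "kafka" in [tags] {
    json {
      source => "message"
    }
  }
}
```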
If you want the pipelines to be independent from each other you need to configure logstash to run multiple pipelines.
From what you shared it seems that you have 3 pipelines: kafka, mssql_dwh, and linux. You can create a directory for each one, move the respective files, and configure them in pipelines.yml.
So your pipelines.yml would be something like this:
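A sketch, assuming you move the conf files into per-pipeline subdirectories named after each pipeline (the paths are illustrative):

```yaml
- pipeline.id: kafka
  path.config: "/etc/logstash/conf.d/kafka/*.conf"
- pipeline.id: mssql_dwh
  path.config: "/etc/logstash/conf.d/mssql_dwh/*.conf"
- pipeline.id: linux
  path.config: "/etc/logstash/conf.d/linux/*.conf"
```

With this layout each pipeline runs independently, so events from the kafka input only pass through the kafka filters and outputs and will no longer hit the json filter meant for other data.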