Elasticsearch parsing error

Hello, I have the same problem as Logstash json parse error, but I am not using any codec => json or handling JSON data anywhere. When I use

output {

    if [type] == "mssql-audit-dwh" {

        stdout {
            codec => "rubydebug"
        }

    }
}

I get nicely parsed data, but when I change it to

output {

    if [type] == "mssql-audit-dwh" {

        elasticsearch {
            hosts => [ "ip", "ip" ]
            index => "mssql-dwh"
            user => "user"
            password => "pw"
        }

    }
}

the data is not parsed and I get a grok parse failure.

ERROR

:exception=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
 at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 6]>}

This is my other pipeline, which has the same error:

input {
    file {
        path => ["/var/log/remote10514"]
        type => "linux"
        start_position => "beginning"
        codec => "plain"
        mode => "tail"
    }
}
filter {
    grok {
        match => { "message" => "%{SYSLOGTIMESTAMP:log_timestamp} %{SYSLOGHOST:hostname} %{DATA:program}\[%{POSINT:pid}\]: %{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log_level}: %{GREEDYDATA:error_message}" }
    }
    mutate {
        replace => { "host" => "%{[host][name]}" }
    }
}
output {

    if [type] == "linux" {

        elasticsearch {
            hosts => ["ip"]
            index => "linux"
            user => "user"
            password => "pw"
        }
    }
}


Full error log:

[2024-10-14T17:10:42,812][WARN ][logstash.filters.json    ][main][a6bb14d6041ac08bc38f9492d932ed1692c2b1db5f3653b9bd32f0a6676ed45f] Error parsing json {:source=>"message", :raw=>"2024-10-14T17:10:42+02:00 bts-test daemon.info patroni 722 - -  2024-10-14 17:10:42,902 INFO: no action. I am (bts-test), a secondary, and following a leader (bts-test2)", :exception=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
 at [Source: REDACTED (`StreamReadFeature.INCLUDE_SOURCE_IN_LOCATION` disabled); line: 1, column: 6]>}

With rubydebug output: (screenshot omitted)

And Kibana output: (screenshot omitted)

Your error explicitly says that you have a json filter.

[2024-10-14T17:10:42,812][WARN ][logstash.filters.json ][main][a6bb14d6041ac08bc38f9492d932ed1692c2b1db5f3653b9bd32f0a6676ed45f] Error parsing json

This error comes from a json filter.

How are you running Logstash? As a service? From the command line?

I don't have a json filter...
And I am running it as a service. When I run it from the command line the data is parsed correctly, but when I change the output to Elasticsearch and Kibana it is not parsed correctly.

filter {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{DATA:process} %{DATA:application} %{NUMBER:pid} - -  %{DATA:timestamp2} %{LOGLEVEL:log_level}: %{DATA:action}\. %{GREEDYDATA:rest}"]
    }

    # remove unnecessary fields and normalize the host field
    mutate {
        replace => { "host" => "%{[host][name]}" }
        remove_field => [ "[event][original]", "_score", "@version" ]
    }
}

The error is coming from a json filter; you may have a json filter in one of your pipelines.

What does your pipelines.yml look like? When you run Logstash as a service it uses the pipelines.yml file to run the pipelines; by default it will look for *.conf files inside /etc/logstash/conf.d.

If you have multiple conf files in this path, Logstash will merge all the files into one big pipeline, and every event will pass through every filter and output. So if you have a json filter in one of the pipelines, that may be your issue.
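For example, a minimal sketch of how to isolate such a filter (the [type] == "kafka" value is hypothetical, since your Kafka input is not shown; use whatever field actually distinguishes your Kafka events): wrapping the json filter in a conditional keeps it from running against events from the other inputs:

filter {

    # Hypothetical guard: only parse events from the kafka input as JSON,
    # so syslog lines from the file inputs never hit this filter.
    if [type] == "kafka" {
        json {
            source => "message"
        }
    }
}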

This is my pipelines.yml:

# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"

These are my conf.d files:

/etc/logstash/conf.d $ ll
total 8
drwxr-xr-x 2 root root 4096 Oct 14 17:25 .
drwxr-xr-x 5 root root 4096 Oct 10 09:26 ..
lrwxrwxrwx 1 root root   25 Oct 14 14:10 filter-kafka.conf -> ../base/filter-kafka.conf
lrwxrwxrwx 1 root root   25 Oct 11 15:43 filter-linux.conf -> ../base/filter-linux.conf
lrwxrwxrwx 1 root root   29 Oct 14 14:10 filter-mssql_dwh.conf -> ../base/filter-mssql_dwh.conf
lrwxrwxrwx 1 root root   24 Oct 14 14:10 input-kafka.conf -> ../base/input-kafka.conf
lrwxrwxrwx 1 root root   24 Oct 11 14:08 input-linux.conf -> ../base/input-linux.conf
lrwxrwxrwx 1 root root   28 Oct 14 14:10 input-mssql-dwh.conf -> ../base/input-mssql-dwh.conf
lrwxrwxrwx 1 root root   25 Oct 14 14:10 output-kafka.conf -> ../base/output-kafka.conf
lrwxrwxrwx 1 root root   25 Oct 11 14:08 output-linux.conf -> ../base/output-linux.conf
lrwxrwxrwx 1 root root   29 Oct 14 14:10 output-mssql-dwh.conf -> ../base/output-mssql-dwh.conf

Only filter-kafka.conf contains a json filter.
But I don't think there is anything wrong with this:

# parse message
json {
    source => "message"
}

Or is there?

Thanks @leandrojmp,

I managed to unlink the Kafka pipelines from conf.d and now my logs are correctly parsed in Kibana. But how can I keep the Kafka pipelines in the conf.d directory without this error?

Your pipelines.yml tells Logstash to create a pipeline named main that merges all *.conf files inside the /etc/logstash/conf.d path.

So, unless you have conditionals in your pipelines, all events from all inputs will pass through all filters and will be sent to all outputs.

If you want the pipelines to be independent from each other you need to configure logstash to run multiple pipelines.

From what you shared it seems that you have three pipelines: kafka, mssql_dwh, and linux. You can create a directory for each of them, move the respective files, and configure them in pipelines.yml.

So your pipelines.yml would be something like this:

- pipeline.id: linux
  path.config: "/etc/logstash/conf.d/linux/*.conf"

- pipeline.id: kafka
  path.config: "/etc/logstash/conf.d/kafka/*.conf"

- pipeline.id: mssql
  path.config: "/etc/logstash/conf.d/mssql/*.conf"
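One caveat about moving the files, since your listing shows relative symlinks into ../base (a sketch under that assumption): simply moving the links into the new subdirectories would break them, because ../base would then resolve to /etc/logstash/conf.d/base. Recreate them one level deeper instead, e.g. for kafka:

mkdir /etc/logstash/conf.d/kafka
# ../../base resolves from conf.d/kafka/ back to /etc/logstash/base
ln -s ../../base/input-kafka.conf  /etc/logstash/conf.d/kafka/input-kafka.conf
ln -s ../../base/filter-kafka.conf /etc/logstash/conf.d/kafka/filter-kafka.conf
ln -s ../../base/output-kafka.conf /etc/logstash/conf.d/kafka/output-kafka.conf

and the same for the linux and mssql files.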

Thank you very much! I will try this. :slight_smile: