Logstash Pipeline Error

Hi there, this is my first time trying to use Logstash to capture logs from pfSense.

This is what I'm using in the pipeline:

input {
    syslog {
        port => 5514
        type => "syslog"
    }
}

filter {
    if [type] == "syslog" {
        grok {
            match => {
                "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} %{DATA:process}: %{GREEDYDATA:log_message}"
            }
        }
        date {
            match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
            target => "@timestamp"
        }
    }
}

output {
    elasticsearch {
        hosts => ["https://192.168.20.180:9200"]
        index => "pfsense-ntopng-logs-%{+YYYY.MM.dd}"
        user => "elastic"
        password => **"password"**
        ssl => true
        cacert => "/etc/elasticsearch/certs/http_ca.crt"
    }
    stdout { codec => rubydebug }
}

I'm getting the following error:

[2024-10-07T17:42:31,172][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.15.2", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
[2024-10-07T17:42:31,176][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-10-07T17:42:31,180][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-10-07T17:42:31,180][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-10-07T17:42:32,279][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-10-07T17:42:33,817][INFO ][org.reflections.Reflections] Reflections took 198 ms to scan 1 urls, producing 138 keys and 481 values
[2024-10-07T17:42:34,693][ERROR][logstash.outputs.elasticsearch] Invalid setting for elasticsearch output plugin:

  output {
    elasticsearch {
      # This setting must be a path
      # File does not exist or cannot be opened /etc/elasticsearch/certs/http_ca.crt
      cacert => "/etc/elasticsearch/certs/http_ca.crt"
      ...
    }
  }

It seems to be the path to the certificate, but I already checked:

root@kibana:/etc/elasticsearch/certs# ls -l /etc/elasticsearch/certs/http_ca.crt
-rwxrwxrwx 1 root elasticsearch 1915 Oct 7 13:46 /etc/elasticsearch/certs/http_ca.crt

Can someone help?

What does ls -ld /etc /etc/elasticsearch /etc/elasticsearch/certs produce?

This is probably a permission issue.

You are running Logstash as a service, right? When you run Logstash as a service it runs under the logstash user, and this user does not have permission to read anything inside /etc/elasticsearch.
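
A quick way to confirm is to try reading the certificate as that user (a sketch, assuming the package-default logstash service account):

sudo -u logstash cat /etc/elasticsearch/certs/http_ca.crt > /dev/null && echo readable || echo "not readable"

Even though the file itself is world-readable, a parent directory (usually /etc/elasticsearch) is normally restricted to root and the elasticsearch group, which blocks the logstash user from traversing into it.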

You need to fix the permissions and make sure that the logstash user can read the file.

You may add the logstash user to the elasticsearch group, or create a new directory owned by the logstash user and keep a copy of the certificate there, for example as sketched below.
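
A minimal sketch of both options (assuming the package-default logstash user and service name, and that /etc/elasticsearch is group-readable by the elasticsearch group, which is the package default):

# Option 1: give the logstash user access via the elasticsearch group, then restart the service
sudo usermod -aG elasticsearch logstash
sudo systemctl restart logstash

# Option 2: copy the CA certificate to a directory the logstash user owns,
# then point cacert in the pipeline at the new path
sudo mkdir -p /etc/logstash/certs
sudo cp /etc/elasticsearch/certs/http_ca.crt /etc/logstash/certs/
sudo chown -R logstash:logstash /etc/logstash/certs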

It worked. I fixed the permissions and now it is collecting. Thanks!