Not getting data into Elasticsearch

Hi all,

I'm sending logs with Filebeat, but I don't receive anything in Kibana. X-Pack security is enabled.

I created a logstash_writer role:

> POST _xpack/security/role/logstash_writer
> {
>   "cluster": ["manage_index_templates", "monitor"],
>   "indices": [
>     {
>       "names": [ "filebeat-*" ],
>       "privileges": ["write", "delete", "create_index"]
>     }
>   ]
> }
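As a sanity check (assuming the standard X-Pack security API used above), the role can be read back with:

> GET _xpack/security/role/logstash_writer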

Then a logstash_internal user:

> POST _xpack/security/user/logstash_internal
> {
>   "password" : "changeme",
>   "roles" : [ "logstash_writer" ],
>   "full_name" : "Internal Logstash User"
> }
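And to confirm the credentials are accepted by the cluster at all (a quick check, assuming Elasticsearch is reachable on localhost:9200; curl prompts for the password when only the user is given):

> curl -u logstash_internal http://localhost:9200/_xpack/security/_authenticate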

I then ran the following command:

> systemctl status -l logstash

> Jun 27 15:29:10 sd-131865 logstash[23755]: [2019-06-27T15:29:10,956][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 403 ({"type"=>"security_exception", "reason"=>"action [indices:admin/create] is unauthorized for user [logstash_internal]"})
> Jun 27 15:29:10 sd-131865 logstash[23755]: [2019-06-27T15:29:10,958][INFO ][logstash.outputs.elasticsearch] Retrying individual bulk actions that failed or were rejected by the previous bulk request. {:count=>125}
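One way to reproduce that failing indices:admin/create action outside Logstash would be to create a matching index by hand as that user (filebeat-test is just a hypothetical name for the test):

> curl -u logstash_internal -X PUT http://localhost:9200/filebeat-test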

What am I doing wrong?

These are my .conf files (some long lines were cut off when I copied them from the terminal):
02-beats-input.conf

input {
  beats {
    port => 5044
  }
}

10-syslog-filter.conf

filter {
  if [source] =~ "gid" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\|(\[%{BASE10NUM:nbr}\])\|%{IPORHOST:ClientIP}\|%{USERNAME:User};%{DATA:Node}\|%{URIPATH:Url}\|%{NOTS…
    }
  } else if [source] =~ "server" {
    grok {
      match => { "message" => [
        "(?m)%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:Loglevel} \[(?<Classname>[^\]]+)\] %{GREEDYDATA:Message}",
        "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:Loglevel}  \[(?<Classname>[^\]]+)\] %{WORD:n} \| %{NUMBER:number} \| %{WORD:b} \| %{DATA:Url} \| %{GREEDYDAT…
        "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:Loglevel}  \[(?<Classname>[^\]]+)\] %{GREEDYDATA:Message}"
      ] }
    }
  }
  date {
    match => [ "timestamp", "MMM dd yyyy HH:mm:ss", "MMM d yyyy HH:mm:ss", "ISO8601" ]
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "[beat][name]", "[beat][version]", "[beat][hostname]", "[host][architecture]", "[host][containerized]", "[host][id]", "[host][name]", "[host]…
  }
}

30-elasticsearch-output.conf

output {
  elasticsearch {
    hosts => ["myIpAddress:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    user => "logstash_internal"
    password => "admin10@"
    document_type => "%{[@metadata][type]}"
  }
}
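For what it's worth, the whole pipeline can be syntax-checked before starting the service (a sketch assuming the standard package layout, with settings under /etc/logstash):

/usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit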

You should try to add a stdout output to see if anything is produced.
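For example, something like this alongside the elasticsearch output (a minimal sketch; the rubydebug codec pretty-prints each event):

output {
  stdout { codec => rubydebug }
}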

Some thoughts: why use Logstash at all in this case? The Elasticsearch ingest node feature should be enough to do the filtering you want.
So connect Filebeat to Elasticsearch directly and define the pipeline to use.
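Roughly like this in filebeat.yml (a sketch, not a drop-in config; "my-pipeline" stands for whatever ingest pipeline you define):

output.elasticsearch:
  hosts: ["myIpAddress:9200"]
  pipeline: "my-pipeline"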

I added stdout, but I'm receiving the operating system logs in Elasticsearch.

That's bad; I have my own log files to send, with their paths defined in Filebeat.

What is a typical document you can see on stdout?

Sorry, this is my first time using stdout; how can I see those documents?

In the console.
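Since Logstash is running under systemd here, its stdout ends up in the journal; assuming the unit is named logstash, the events can be followed with:

journalctl -u logstash -f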
