GCP LOGGING TO ELASTIC | security

Hi,

We have Elasticsearch running on VMs, and we are trying to ship logs from GCP to our local Elastic cluster.

The thing is, GCP uses a service account and key. Is there a way to add GCP service account credentials to Elastic for authentication/communication?

Thanks!
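(A note on the credentials question: the GCP service account key typically authenticates the shipper to GCP, while Elasticsearch/Logstash are authenticated separately with their own user or API key. A minimal sketch, assuming logs are exported to Pub/Sub and read by Filebeat's gcp-pubsub input; all project/topic names are placeholders:)

# filebeat.yml sketch: the service account key file authenticates Filebeat
# to GCP Pub/Sub; Logstash/Elasticsearch credentials are configured separately.
filebeat.inputs:
  - type: gcp-pubsub
    project_id: my-project            # placeholder
    topic: exported-logs              # placeholder
    subscription.name: filebeat-sub   # placeholder
    credentials_file: /etc/filebeat/gcp-sa.json

output.logstash:
  hosts: ["logstash-host:5044"]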

How exactly is GCP shipping logs to Elasticsearch?

hi,

A Filebeat container is shipping the logs to the Beats input on port 5044.

I can see the Logstash pipeline is receiving the events (see attached), but I can't see the index in Elasticsearch :expressionless:

And Logstash's logs are showing DLQ errors.

[2023-05-15T12:51:46,305][ERROR][org.logstash.common.io.DeadLetterQueueWriter][main][1806ca102a69b9f36c2ecba3b92c3fc44b39513306e53c4e4a8e8d205b0e7a22] cannot write event to DLQ(path: /some/path/failed/queue/main): reached maxQueueSize of 1073741824

Thanks!
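(Aside: 1073741824 bytes is the default 1 GiB dead letter queue cap. It is controlled by pipeline settings in logstash.yml; a sketch, with example values:)

# logstash.yml sketch (values are examples)
dead_letter_queue.enable: true
dead_letter_queue.max_bytes: 2gb                  # default is 1024mb, i.e. the 1073741824 above
path.dead_letter_queue: /some/path/failed/queue   # the /main suffix in the error is the pipeline id

Raising the cap only buys space, though; the DLQ fills because events are being rejected downstream, so the real fix is whatever is causing the rejections.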

You will need to share more info then: configs and logs from the various phases.

Hi,

This is the Beats pipeline:

input{
  beats {
    port => 5044
    ecs_compatibility => disabled
    type => "filebeat-logs"
  }
}

filter {
   dissect {
     mapping => {
       "message" => "%{timestamp} %{msg} %{} %{?ip} %{event}"
      }
    }
}

and the output config is:

filter {
  mutate {
      add_field => { "[@metadata][index_name]" => "%{[@metadata][type]}" }
  }


}
output {
    elasticsearch {
      user => "someuser"
      password => "somepassword"
      hosts => ['https://elkserver:9200','https://elkserver:9200','https://elkserver:9200']
      ssl_certificate_verification => false
      manage_template => false
      data_stream => false
      index => "%{[@metadata][index_name]}-8-%{+YYYY.MM.dd}"
    }
}

The logs are not draining to Elasticsearch; they are going to the Logstash failed events queue instead. Not sure why :expressionless:
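(On the dissect filter above: %{} is an unnamed skip and %{?ip} a named skip, so the pattern needs at least five space-separated tokens, with the final %{event} capturing the rest of the line. A hypothetical line it would match, versus the compact JSON bodies these container logs actually carry:)

# A line the pattern "%{timestamp} %{msg} %{} %{?ip} %{event}" would match (hypothetical):
#   2023-05-16T04:46:44Z started - 10.0.0.1 worker-restart
#   -> timestamp="2023-05-16T04:46:44Z", msg="started", event="worker-restart"
# A compact JSON body like {"severity":"Info",...} contains no space delimiters,
# so dissect reports "pattern not found" and tags the event _dissectfailure.
# Note the DLQ writes themselves come from Elasticsearch rejecting documents,
# not from the dissect failure itself.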

FYI Logstash will merge both of those together when it runs.

What do your Logstash logs show? What does an event from the DLQ look like?
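(For inspecting DLQ contents, the dead_letter_queue input plugin can replay the entries; a minimal sketch, using the path from the error above:)

input {
  dead_letter_queue {
    path => "/some/path/failed/queue"   # value of path.dead_letter_queue
    pipeline_id => "main"
    commit_offsets => false             # read without advancing the stored offset
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
}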

Yep, it merges the input pipeline and the output.

Here are the Logstash logs:


[2023-05-16T00:46:50,466][WARN ][org.logstash.dissect.Dissector][main][a554ae92dc4c81b47753ce18574db79350e6ba4d3ad250324f0b4002bbcbff55] Dissector mapping, pattern not found {"field"=>"message", "pattern"=>"%{timestamp} %{msg} %{} %{?ip} %{event}", "event"=>{"ecs"=>{"version"=>"8.0.0"}, "tags"=>["beats_input_codec_plain_applied", "_dissectfailure"], "agent"=>{"name"=>"lmn06-pqr-node5", "id"=>"96f20428-c0e1-4524-806c-352b6894a2b0", "version"=>"8.7.1", "ephemeral_id"=>"9f882b81-6283-4091-83aa-8eeef3aa3f49", "type"=>"filebeat"}, "log"=>{"offset"=>4668271, "file"=>{"path"=>"/var/log/containers/abc-synchronizer-lmn06--qat-external-02-228c37vkqzq_abc_abc-synchronizer-d269031156e7001818b334354aa1824d5aae71acd8daa62bb9a7a3df24171806.log"}}, "@version"=>"1", "stream"=>"stderr", "input"=>{"type"=>"container"}, "@timestamp"=>2023-05-16T04:46:44.411Z, "message"=>"{\"level\":\"INFO\",\"thread\":\"abc-Timer-6\",\"mdc\":{},\"className\":\"com.abc.hybrid.runtime.contract.sync.context.ControlPlaneSyncContext\",\"method\":\"cleanupOldVersion\",\"severity\":\"INFO\",\"message\":\"cleaned up older versions\",\"formattedDate\":\"2023-05-16T04:46:44.411Z\",\"logger\":\"CONTRACT-REPLICATION\"}", "host"=>{"architecture"=>"x86_64", "hostname"=>"lmn06-pqr-node5", "name"=>"lmn06-pqr-node5", "os"=>{"kernel"=>"5.15.0-1008-gkeop", "family"=>"debian", "name"=>"Ubuntu", "platform"=>"ubuntu", "codename"=>"focal", "version"=>"20.04.6 LTS (Focal Fossa)", "type"=>"linux"}, "containerized"=>true, "ip"=>["10.17.21.17", "fe10::52ff:feb4:136c", "129.254.123.1", ...]}}}

[...][ERROR][org.logstash.common.io.DeadLetterQueueWriter][main][894a69b2c62154dcaff1ea73397d2649fbe23ca59f2f22c975193f066d57a71f] cannot write event to DLQ(path: /path/logstash/failed/queue/main): reached maxQueueSize of 1073741824

DLQ events

{"type":"filebeat","agent":{"name":"abc06-def-node5","id":"96f20428-c0e1-4524-806c-352b6894a2b0","type":"filebeat","ephemeral_id":"1234467-34343-4091-83aa-8eeef3aa3f49","version":"8.7.1"},"log":{"file":{"path":"/var/log/containers/ingress-gw-5d7dfd96bd-v266q_expi_discovery-5b07578d5e9cd6bd6a7e154367386e3qewqeweqweqeqa948abd98a6.log"},"offset":5339462},"host":{"ip":["10.17.21.17","fe10::52ff:feb4:136c","129.254.123.1", <<<many IPv6 addresses elided>>>],"name":"abc06-def-node5","os":{"type":"linux","kernel":"5.15.0-1008-gkeop","version":"20.04.6 LTS (Focal Fossa)","family":"debian","name":"Ubuntu","platform":"ubuntu","codename":"focal"},"hostname":"abc06-def-node5"},"index_name":"filebeat","msg":"reload","stream":"stdout","tags":["beats_input_codec_plain_applied","dead_letter"],"@version":"1","timestamp":"{\"severity\":\"Info\",\"timestamp\":\"2023-05-16T02:53:34.55609165Z\",\"logger\":\"monitor\",\"message\":\"Triggering","input":{"type":"container"},"message":"{\"severity\":\"Info\",\"timestamp\":\"2023-05-16T02:53:34.55609165Z\",\"logger\":\"monitor\",\"message\":\"Triggering reload of file configuration\"}","ecs":{"version":"8.0.0"},"@timestamp":"2023-05-16T02:53:34.556Z","event":"configuration\"}"}

Not entirely sure, but maybe the dissect filter is causing this?

That's likely why; the events aren't matching the dissect pattern.

I tried removing all the filters, but I still see the DLQ issue :expressionless:

Since the events are already in JSON format, I'm not sure why I'm seeing the DLQ issue.
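(If the message body is already JSON, one option is to parse it with the json filter rather than dissecting it; a sketch, where the target field name is an assumption:)

filter {
  json {
    source => "message"
    target => "app"                       # hypothetical field, avoids clobbering top-level fields
    tag_on_failure => ["_jsonparsefailure"]
  }
}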

I can see this error. Does anyone know how to fix the index name?


Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"_doc,abcIndex-8-2023.05.17", :routing=>nil}

status"=>400, "error"=>{"type"=>"invalid_index_name_exception", "reason"=>"Invalid index name [_doc,abcIndex-8-2023.05.17], must not contain the following characters [>, |, ,, /, \\,  
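(The comma in _doc,abcIndex-8-2023.05.17 suggests the %{[@metadata][index_name]} sprintf reference resolved to an array; Logstash joins array values with commas, which is illegal in an index name. One way around it is to force a single known-good value before the output; a sketch:)

filter {
  mutate {
    # overwrite whatever ended up in the metadata with one fixed value
    replace => { "[@metadata][index_name]" => "filebeat" }
  }
}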

The issue is resolved: for the Beats input, the type should just be the beat name (here, "filebeat").

input{
  beats {
    port => 5044
    type => "filebeat"
  }
}

filter {
  mutate {
    remove_field => "[ip]" 
  }
}
filter {
  mutate {
    remove_field => "[mac]"
  }
}

output {
    elasticsearch {
      user => "someuser"
      password => "somepassword"
      hosts => ['https://someServer:9200']
      ssl_certificate_verification => false
      manage_template => false
      data_stream => false
      #index => "%{[@metadata][index_name]}-8-%{+YYYY.MM.dd}"
      index => "someIndexName-8-%{+YYYY.MM.dd}"
    }
}

The underlying cause: that server had multiple pipelines, which generated too many failed events and filled the DLQ.
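(With multiple pipelines on one Logstash node, each pipeline can get its own config and its own DLQ budget in pipelines.yml; a sketch, where the ids and paths are hypothetical:)

# pipelines.yml sketch
- pipeline.id: filebeat
  path.config: "/etc/logstash/conf.d/filebeat.conf"
  dead_letter_queue.enable: true
  dead_letter_queue.max_bytes: 2gb
- pipeline.id: other-source
  path.config: "/etc/logstash/conf.d/other.conf"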

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.