Kafka Logstash logs are showing up in the Filebeat Logstash index

Hello,

I have a Kafka-to-Logstash config, and Logstash is receiving the logs from Kafka. Here is the config:

input {
    kafka {
        topics => ["sitlogtopic","locallogtopic"]
        bootstrap_servers => "ddr-kafkadev.pvt.ccilindia.com:9092"
        #auto_offset_reset => "earliest"
        #consumer_threads => 1
        #decorate_events => true
    }
}
filter {
    grok {
        match => {
            "message" => [
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} %{WORD:hostname} (?<SpringAppName>%{WORD}\-%{WORD}) %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} %{WORD:hostname} %{WORD:SpringAppName} %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} %{WORD:HOSTNAME} (?<SpringAppName>%{WORD}\ %{WORD}) %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}\-%{WORD}) %{WORD:SpringAppName} %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp} %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}\-%{WORD}) (?<SpringAppName>%{WORD}\ %{WORD}) %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp} %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}\-%{WORD}) %{WORD:SpringAppName} %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}) (?<SpringAppName>%{WORD}\ %{WORD}) %{GREEDYDATA:MESSAGE}",
                "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{WORD:hostname} (?<SpringAppName>%{WORD}) (?<MESSAGE>(.|\r|\n)*)",
                "%{DATESTAMP:Timestamp} %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}) (?<SpringAppName>%{WORD}\ %{WORD}) %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}) (?<SpringAppName>%{WORD}\ %{WORD}) %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp} %{LOGLEVEL:Loglevel} %{WORD:hostname} (?<SpringAppName>%{WORD}\-%{WORD}) (?<stacktrace>(.|\r|\n)*)",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}\-%{WORD}) (?<SpringAppName>%{WORD}\-%{WORD}) %{GREEDYDATA:MESSAGE}",
                "%{DATESTAMP:Timestamp}  %{LOGLEVEL:Loglevel} (?<hostname>%{WORD}\-%{WORD}-%{WORD}) (?<SpringAppName>%{WORD}\-%{WORD}) %{GREEDYDATA:MESSAGE}"
            ]
        }
    }
    #mutate { add_field => { "SpringAppName" => "%{Ap}%{Service}" } }
    mutate {
        # Use [event][original] for the nested ECS field; the dotted name
        # 'event.original' only matches a literal top-level field.
        remove_field => ["message", "[event][original]"]
    }
}

output {

    elasticsearch {
            hosts => ["https://elastic-uat.ccilindia.net:9200"]
            #index => "kafkadev-%{+yyyy.MM.dd}"
            ilm_rollover_alias => "kafkadev"
            ilm_pattern => "000001"
            ilm_policy => "kafkadev"
            ilm_enabled => true
            cacert => "/etc/logstash/certs/GeoTrust-RSA-CA-Intermediate-2018.pem"
            user => "elastic"
            password => "Ccil@2023"
            ssl => true
            ssl_certificate_verification => true
    }
    stdout { codec => rubydebug }
}

But I have another server sending Filebeat logs to this Logstash, and I am seeing the same Kafka logs in the Filebeat index. Here is the Filebeat Logstash config:

input {
    beats {
        port => 5044
    }
}


output {
    elasticsearch {
        hosts => ["https://elastic-uat.ccilindia.net:9200"]
        user => "elastic"
        password => "Ccil@2023"
        cacert => "/etc/logstash/certs/GeoTrust-RSA-CA-Intermediate-2018.pem"
        index => "gitlab-filebeat-8.7"
        ssl => true
        ssl_certificate_verification => false
    }
    stdout { codec => rubydebug }
}

Is this another Logstash server, or another Logstash pipeline on the same server?

It is the same Logstash server, but a different Logstash pipeline.

How are you running Logstash? As a service? What does your pipelines.yml look like?

Yes, as a service. Here is pipelines.yml:

cat pipelines.yml
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"

That's the issue: you are running just one pipeline, so Logstash merges both of your pipeline configurations and data from both inputs is sent to both outputs.

You need to configure pipelines.yml to use multiple pipelines as explained in the documentation.

This way your pipelines will be independent from each other.
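As an aside, if you ever did need to keep everything in a single pipeline, the usual workaround is to tag each input and wrap the outputs in conditionals. A minimal sketch based on the configs you posted (the tag names are made up, and the auth/TLS/ILM settings are elided to comments for brevity):

```
input {
    kafka {
        topics => ["sitlogtopic", "locallogtopic"]
        bootstrap_servers => "ddr-kafkadev.pvt.ccilindia.com:9092"
        tags => ["from_kafka"]   # label events from this input
    }
    beats {
        port => 5044
        tags => ["from_beats"]   # label events from this input
    }
}

output {
    if "from_kafka" in [tags] {
        elasticsearch {
            hosts => ["https://elastic-uat.ccilindia.net:9200"]
            ilm_rollover_alias => "kafkadev"
            # ... plus the rest of your ILM/auth/TLS settings
        }
    }
    if "from_beats" in [tags] {
        elasticsearch {
            hosts => ["https://elastic-uat.ccilindia.net:9200"]
            index => "gitlab-filebeat-8.7"
            # ... plus your auth/TLS settings
        }
    }
}
```

Separate pipelines are still the cleaner option, since they also isolate the two flows from each other's backpressure.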

Even after adding the two pipeline ids, I can still see some logs from Kafka in the index. Here is pipelines.yml:

# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"
- pipeline.id: kafka
  path.config: "/etc/logstash/conf.d/kafkasre.conf"
- pipeline.id: gitlab
  path.config: "/etc/logstash/conf.d/gitlab.conf"

Are you still seeing new logs from the Kafka input in the Beats index after you restarted Logstash?

Yes, even after restarting the Logstash service.

That should not happen, so something is still wrong in your configuration.

Please restart Logstash to get fresh logs and share the logs it creates while starting.

Also, run the following request in your logstash server and share the result:

curl http://localhost:9600/_node/pipelines?pretty
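The output lists the pipelines Logstash actually loaded. With your pipelines.yml in effect you would expect both ids to appear; if only main shows up (with both config files merged under it), pipelines.yml is not being read. An illustrative sketch of the shape (real output contains more fields per pipeline):

```json
{
  "pipelines" : {
    "kafka" : { "...": "..." },
    "gitlab" : { "...": "..." }
  }
}
```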

I was able to solve the issue by commenting out the path.config setting in logstash.yml.
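That fits: when path.config is set in logstash.yml it behaves like passing -f on the command line, and Logstash then ignores pipelines.yml, loading every matching file into one merged pipeline. The offending line typically looks like this (path shown is the common default; adjust to your install):

```
# /etc/logstash/logstash.yml
#
# If this setting is present, pipelines.yml is ignored and every matching
# file is merged into a single pipeline. Commenting it out lets
# pipelines.yml take effect:
#
# path.config: /etc/logstash/conf.d/*.conf
```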

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.