Filebeat to Logstash: multiple indexes

I'm new to ELK and I'm trying to configure Filebeat on multiple instances. I have different types of logs:

  1. Kafka logs
  2. ZooKeeper logs
  3. HDFS logs
  4. YARN logs
  ...

On all of these instances I'm trying to configure Filebeat, and around 50 Filebeat agents send logs to a single Logstash instance. So I'm wondering: how can I differentiate all these logs? Does it work with multiple index names, or filters in Logstash, or something else?

So I'm wondering: how can I differentiate all these logs?

You can use conditionals in the Logstash configuration to do different things for different kinds of events.

Does it work with multiple index names, or filters in Logstash, or something else?

You can send different kinds of events to different indexes, yes.
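In outline, something along these lines (a minimal sketch, assuming each Filebeat sets a type field via its document_type option; the hosts and index names are placeholders):

input {
  beats {
    port => 5044
  }
}

output {
  # Route events to per-type indexes based on the type field
  # (assumed to be set by Filebeat's document_type option).
  if [type] == "kafka" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "kafkalogs-%{+YYYY.MM.dd}"
    }
  } else if [type] == "zookeeper" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "zookeeperlogs-%{+YYYY.MM.dd}"
    }
  }
}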

Your question is very open so it's hard to be specific.

If possible, can you please send basic syntax for different indexes with a single Logstash and multiple Filebeats?

I'm sending logs from multiple Filebeats to a single Logstash and then on to Elasticsearch.

filebeats (20) ---> logstash (1) ---> elasticsearch (1) ---> kibana

Does it work? The example you sent me was directly from Filebeat to Elasticsearch.

Does it work? The example you sent me was directly from Filebeat to Elasticsearch.

The example I gave had nothing at all to do with Filebeat.

OK, got you. Filebeat just pushes the data; Logstash will do all the filtering. Thanks.

Can I get the official repo for Logstash and Filebeat?

I was trying to configure Logstash on my instance and ran into an issue. I don't know if I was doing anything wrong. I tried on both Ubuntu and CentOS and hit the same issue with 5.0.1, but 2.4 works for me. I want to use the latest version. This is the guide I was trying to use:

https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

[root@ip-****** ]# service logstash status
logstash: unrecognized service

These are the steps I was following to configure Logstash:
rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
Add the following to a file with a .repo suffix in your /etc/yum.repos.d/ directory, for example logstash.repo:

[logstash-5.x]
name=Elastic repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
And your repository is ready for use. You can install Logstash with:

sudo yum install logstash

starting agent {:level=>:info, :file=>"logstash/agent.rb", :line=>"207", :method=>"execute"}
starting pipeline {:id=>"main", :level=>:info, :file=>"logstash/agent.rb", :line=>"469", :method=>"start_pipeline"}
Settings: Default pipeline workers: 1
Beats inputs: Starting input listener {:address=>"0.0.0.0:5044", :level=>:info, :file=>"logstash/inputs/beats.rb", :line=>"111", :method=>"register"}
Beats inputs: Starting input listener {:address=>"0.0.0.0:5044", :level=>:info, :file=>"logstash/inputs/beats.rb", :line=>"111", :method=>"register"}
Pipeline aborted due to error {:exception=>#<Errno::EADDRINUSE: Address already in use - bind - Address already in use>, :backtrace=>["org/jruby/ext/socket/RubyTCPServer.java:118:in `initialize'", "org/jruby/RubyIO.java:853:in `new'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.2.9/lib/lumberjack/beats/server.rb:51:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-beats-2.2.9/lib/logstash/inputs/beats.rb:119:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:330:in `start_inputs'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:329:in `start_inputs'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:180:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error, :file=>"logstash/agent.rb", :line=>"475", :method=>"start_pipeline"}
stopping pipeline {:id=>"main", :file=>"logstash/agent.rb", :line=>"388", :method=>"shutdown_pipelines"}

Can you please help me with this error?
Thanks.

It looks like you've defined two beats inputs, both trying to listen on the same port. Keep in mind that Logstash reads all files in /etc/logstash/conf.d. Perhaps you have a left-over backup file or something similar?
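A quick way to check (a hedged sketch, assuming the default config path from the package install):

# List everything in the directory Logstash loads, including stray backup files:
ls -la /etc/logstash/conf.d/
# Find every file that mentions the port, to spot duplicate listeners:
grep -rn "5044" /etc/logstash/conf.d/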

My question was: I know I can send 10 Kafka logs using Beats to a single Logstash, but if I have, for example, 10 Kafka (Beats), 5 ZooKeeper (Beats), 4 Spark (Beats), and so on, can I send all the different types to a single Logstash with different indexes?

input {
  beats {
    type => "kafka"
    port => "5044"
  }
  beats {
    type => "zookeeper"
    port => "5044"
  }
}

output {
  if [type] == "kafka" {
    elasticsearch {
      action => "index"
      hosts => "elasticsearchip:80"
      index => "kafkalogs"
    }
    stdout { codec => rubydebug }
  } else {
    elasticsearch {
      action => "index"
      hosts => "elasticsearchip:80"
      index => "zookeeperlogs"
    }
    stdout { codec => rubydebug }
  }
}

Can I send all the different types to a single Logstash with different indexes?

Yes, but you obviously can't have multiple beats listeners using the same port. Either use multiple ports, or use a single listener and some other method to distinguish between different kinds of events.

If possible, can you please provide me with sample syntax? And can you please tell me some other ports than 5044? I will try those.

If possible, can you please provide me with sample syntax?

An example of what?

Can you please tell me some other ports than 5044? I will try those.

You can use any port that's available on your machine. Try 5045, for example.
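So the two-listener version of your input would look something like this (a sketch; 5045 is just a suggestion, and each Filebeat's output needs to point at the matching port):

input {
  beats {
    type => "kafka"
    port => "5044"
  }
  beats {
    type => "zookeeper"
    port => "5045"
  }
}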

I'm not sure where I went wrong; an index was being created only for the ZooKeeper logs but not for the Kafka logs. Can you please correct the syntax if I was doing anything wrong?
input {
  beats {
    type => "kafka"
    port => "5044"
  }
  beats {
    type => "zookeeper"
    port => "5045"
  }
}

output {
  if [type] == "kafka" {
    elasticsearch {
      action => "index"
      hosts => "elasticsearchip:80"
      index => "kafkalogs"
    }
    stdout { codec => rubydebug }
  } else {
    elasticsearch {
      action => "index"
      hosts => "elasticsearchip:80"
      index => "zookeeperlogs"
    }
    stdout { codec => rubydebug }
  }
}

That looks correct. The Logstash logs should contain more clues about what's going on.
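You can also ask Logstash to validate the configuration without starting it (a hedged sketch; the flag and install path depend on your version):

# Logstash 2.x:
/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/
# Logstash 5.x:
/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/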

I'm not sure where it's going wrong; this is the error I'm getting:

Reading config file {:config_file=>"/etc/logstash/conf.d/logstash.conf", :level=>:debug, :file=>"logstash/config/loader.rb", :line=>"69", :method=>"local_config"}
fetched an invalid config {:config=>"input {\n beats {\n port => 5044\n tags => [ "bh-test" ]\n }\nfilter{\nif [type] == "clouda" {\n grok {\n match => [ { "message" => "%{cloud-init}" } ]\n }\n }\n\n}\noutput {\nif "bh-test" in [tags] {\n elasticsearch {\n action => "index"\n hosts => "endpoint:80"\n index => "w-test"\n }\n }\n}\n\n", :reason=>"Expected one of #, => at line 7, column 4 (byte 82) after input {\n beats {\n port => 5044\n tags => [ "bh-test" ]\n }\nfilter{\nif ", :level=>:error, :file=>"logstash/agent.rb", :line=>"430", :method=>"create_pipeline"}
starting agent {:level=>:info, :file=>"logstash/agent.rb", :line=>"207", :method=>"execute"}

filebeat:

- input_type: log
  paths:
    - /var/log/cloud-init.log
  document_type: clouda
  tags: ["bhtest"]

logstash:

input {
  beats {
    port => 5044
    tags => [ "bhtest" ]
  }
filter {
  if [type] == "clouda" {
    grok {
      match => [ { "message" => "%{cloud-init}" } ]
    }
  }
}
output {
  if "bhtest" in [tags] {
    elasticsearch {
      action => "index"
      hosts => "ip:80"
      index => "w-test"
    }
  }
}

match => [ { "message" => "%{cloud-init}" } ]

Change to:

 match => { "message" => "%{cloud-init}" }

This might not be all that Logstash is complaining about, though. Judging by the error message, the input block is never closed: the } after the beats block closes beats, but there's no second } before filter starts, so filter is being parsed as a plugin inside input. If fixing that isn't enough, comment out blocks to narrow things down and consider running the config file through e.g. hexdump to make sure you don't have any invisible garbage characters.
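For example (assuming the config path shown in your log output):

# Dump the raw bytes; look for anything outside plain ASCII,
# e.g. curly quotes pasted in from a browser or document:
hexdump -C /etc/logstash/conf.d/logstash.conf | head -50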

I was trying something like this, but I wasn't able to see any logs or index in Kibana. My main problem is getting different indexes for the different files I have. Please help me with this issue.

filebeat:
  prospectors:
    - paths:
        - /var/log/redis/*.log
      document_type: redis

    - paths:
        - /var/log/python/*.log
      document_type: python

    - paths:
        - /var/log/mongodb/*.log
      document_type: mongodb

input {
  beats {
    port => 5044
  }
}

output {
  # Customize elasticsearch output for Filebeat.
  if [@metadata][beat] == "filebeat" {
    elasticsearch {
      hosts => "localhost:9200"
      manage_template => false
      # Use the Filebeat document_type value for the Elasticsearch index name.
      index => "%{[@metadata][type]}-%{+YYYY.MM.dd}"
      document_type => "log"
    }
  }
}

This is what I was looking for, but forwarding through Logstash and from there to Elasticsearch with multiple indexes.
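One thing to double-check: for events to reach Logstash at all, the Filebeat configuration above also needs a logstash output section, which isn't shown in the snippet. A minimal sketch in the same Filebeat 1.x-style YAML (the host name is a placeholder):

output:
  logstash:
    # Point this at the machine running the beats input on port 5044.
    hosts: ["logstash-host:5044"]

With that in place, the %{[@metadata][type]} index pattern in the Logstash output should create separate redis-*, python-*, and mongodb-* indexes from the document_type values.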