Help with using Logstash

Hi all,

Currently, I am using Filebeat and Metricbeat to ship logs from Kubernetes into Elastic Cloud. I would now like to test this setup using Logstash. Can somebody guide me on how to pull the data? I am aware of setting up the pipeline config in the cloud console, but I would like to know whether I need to change the output settings in the config file, i.e. Filebeat -> Logstash -> cloud.

Alternatively, is there any way to use Logstash alone to read input from Kubernetes directly?

I would like to find the best way to handle the log data, hence I am testing all the available features.

Thanks in advance,

Cheers,
Kanthi.

To fine-tune the question a bit more:

I assume that in the pipeline section provided in the cloud console, we should enter the required input and output. Then, from the terminal, we execute the pipeline with the following command:

bin/logstash -f ###.conf

But I am not sure how to redirect Filebeat to use Logstash, or how to execute the pipeline from there.

If I use this pipeline config, can I fetch the logs?

input {
  beats {
    port => "5044"
    host => "https://####.ap-southeast-2.aws.found.io:9243"
  }
}
filter {
  if [type] == "kube-logs" {

    mutate {
      rename => ["log", "message"]
    }

    date {
      match => ["time", "ISO8601"]
      remove_field => ["time"]
    }

    grok {
        match => { "source" => "/var/log/containers/%{DATA:pod_name}_%{DATA:namespace}_%{GREEDYDATA:container_name}-%{DATA:container_id}.log" }
        remove_field => ["source"]
    }
  }
}
output {
  elasticsearch {
    hosts => "https://####.ap-southeast-2.aws.found.io:9243"
    manage_template => false
    index => "%{[@metadata][filebeat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

I am trying to get input from Filebeat and wanted to confirm: is the host address provided in the input supposed to be the Elasticsearch address? And in the output, is the generalized index format correct, or should we specify the absolute index name?
Also, the pipeline is saved in the cloud, so how do I proceed from there?

The host in the input section is the host that has filebeat installed on it to ship logs. You need to make sure Filebeat is pointed at your Logstash server and that Logstash is listening for connections from it.
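
For example, the Filebeat side would be pointed at Logstash rather than Elasticsearch. A minimal sketch, assuming Logstash is reachable from the Filebeat host at the placeholder address logstash-host and listening on the default Beats port 5044:

# filebeat.yml (sketch) -- send events to Logstash instead of Elasticsearch.
# Only one output can be enabled at a time, so comment out any
# output.elasticsearch section first.
output.logstash:
  # "logstash-host" is a placeholder for the machine running Logstash.
  hosts: ["logstash-host:5044"]

Logstash then forwards the events to your Elastic Cloud cluster through its own elasticsearch output, as in the pipeline you posted.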

@mitchdavis OK, in that case, can I install Logstash in Elastic Cloud and point it at the Elasticsearch host address? And since I configure the pipeline there, how can I stash my first event?

Thanks.

The host in the input section is the host that has filebeat installed on it to ship logs.

No it's not. It's the address or hostname of the local network interface on which to listen for new connections.
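
As a sketch, the beats input only needs a local listen address and port (0.0.0.0 binds to all interfaces, and 5044 is the conventional Beats port); the Elastic Cloud URL belongs only in the elasticsearch output:

input {
  beats {
    # Listen on all local network interfaces for connections from Filebeat.
    # Do not put the Elasticsearch / Elastic Cloud URL here.
    host => "0.0.0.0"
    port => 5044
  }
}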


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.