How to send MongoDB logs to Elasticsearch through Logstash

Hi There,

How do I send MongoDB logs into Elasticsearch?

MongoDB is on a VM and Logstash is on a different server; what is the best way to ingest the logs into Elasticsearch?

Thanks
Kishore

You might need to use this approach:

Hi There,

Thanks for the reply. Can you please explain a little, as I am a newbie to ELK?
My primary aim is to read the MongoDB logs on one server, ship them to Elasticsearch, and trigger alerts based on events such as startup, shutdown, etc.

What is the best way to approach this?
Also, please let me know where I can get the plugins, though I could find them in the logstash-plugin.bat list output.

I was able to ingest logs from MongoDB by following the steps below.

  1. Change the URL from localhost to the server's address

Update the following parameters in the elasticsearch.yml file:

network.host:

http.bind_host:
http.publish_host:
http.host:

Bounce Elasticsearch.
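
For illustration, a minimal sketch of what those elasticsearch.yml settings might look like (10.0.0.5 is a hypothetical placeholder IP, not a value from this thread):

    # elasticsearch.yml -- example values only; substitute your server's own address
    network.host: 10.0.0.5
    http.bind_host: 10.0.0.5
    http.publish_host: 10.0.0.5
    http.host: 10.0.0.5

Typically setting network.host alone is enough, since the http.* settings default to it.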

  2. Ingest logs from MongoDB into Elasticsearch

Install Filebeat on the MongoDB server.
Update filebeat.yml with the following settings:

    #-------------------------- Elasticsearch output ------------------------------
    output.elasticsearch:
      # Array of hosts to connect to.
      hosts: [":9200"]

    filebeat.inputs:
    - type: log
      # Change to true to enable this input configuration.
      enabled: true
      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - /db/mnglog/elkpoc/elkpoc*.log
  3. Start Filebeat using the command below:
     ./filebeat -e -v -c filebeat.yml

You'll start seeing the message
2020-02-03T22:33:07.232-0800 INFO pipeline/output.go:105 Connection to backoff(elasticsearch(http://:9200)) established

Thanks!!

I was able to ingest the logs directly from the web server where the logs are hosted, using Filebeat to Elasticsearch. Now I want to send the logs to Elasticsearch through Logstash.
How do I do that?
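
One common way to do this (a hedged sketch, not something confirmed in this thread) is to point Filebeat's output at Logstash instead of Elasticsearch, and give Logstash a pipeline with a beats input and an elasticsearch output. The host names, port 5044, the pipeline file name, and the index name below are placeholder assumptions:

    # filebeat.yml -- ship events to Logstash instead of Elasticsearch
    # (logstash-host:5044 is an example address; 5044 is the conventional Beats port)
    #output.elasticsearch:
    #  hosts: [":9200"]
    output.logstash:
      hosts: ["logstash-host:5044"]

    # mongodb-pipeline.conf -- a minimal Logstash pipeline (hypothetical file name)
    input {
      beats {
        port => 5044                                  # listen for events from Filebeat
      }
    }
    output {
      elasticsearch {
        hosts => ["http://elasticsearch-host:9200"]   # example Elasticsearch address
        index => "mongodb-logs-%{+YYYY.MM.dd}"        # example index name
      }
    }

Start Logstash with something like bin/logstash -f mongodb-pipeline.conf and restart Filebeat; events should then flow from the MongoDB server through Logstash into Elasticsearch.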

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.