Trying to set up Logstash with a GitHub repo

I'm using this GitHub repo as a starting point. I started Elasticsearch and Kibana from a docker-compose file and I'm currently trying to get `logstash -f logstash.conf` to work from the command prompt, but it hits an error message.

    D:\Dev\Django\django-logs-filebeat\.docker\logstash\pipeline>logstash -f logstash.conf

to start the Logstash app, which results in the following warning message.

I added Logstash to the Windows PATH using a local Logstash version downloaded from elastic.co.

[2021-04-08T03:59:37,184][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://elasticsearch:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::ResolutionFailure] elasticsearch"}

The repo uses a Dockerfile and docker-compose.yml to run the ELK stack.

I'm using `docker-compose up -d` to start the containers:

    D:\Dev\Django\django-logs-filebeat>docker-compose up -d
    Starting django-logs-filebeat_api_1             ... done
    Recreating django-logs-filebeat_elasticsearch_1 ... done
    Starting django-logs-filebeat_nginx_1           ... done
    Starting django-logs-filebeat_filebeat_1        ... done
    django-logs-filebeat_logstash_1 is up-to-date
    django-logs-filebeat_kibana_1 is up-to-date

Then I can reach localhost:9200 for a short Elasticsearch description and localhost:5601 for Kibana. The message below is shown at localhost:9200:

    {
      "name" : "69f1328d6e84",
      "cluster_name" : "docker-cluster",
      "cluster_uuid" : "0D6-1TddRlSuP2tEH332SA",
      "version" : {
        "number" : "7.9.1",
        "build_flavor" : "default",
        "build_type" : "docker",
        "build_hash" : "083627f112ba94dffc1232e8b42b73492789ef91",
        "build_date" : "2020-09-01T21:22:21.964974Z",
        "build_snapshot" : false,
        "lucene_version" : "8.6.2",
        "minimum_wire_compatibility_version" : "6.8.0",
        "minimum_index_compatibility_version" : "6.0.0-beta1"
      },
      "tagline" : "You Know, for Search"
    }

It's because Logstash is running outside of Docker, so the hostname `elasticsearch` doesn't resolve to anything. You need to use `localhost` or `127.0.0.1` in your Logstash config. The error is telling you that it can't connect to Elasticsearch.
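For anyone following along, a minimal sketch of what the `output` section of logstash.conf would look like in this situation (the host and port are the stack defaults shown in this thread, not taken from the repo):

```conf
output {
  elasticsearch {
    # "elasticsearch" only resolves inside the compose network;
    # from the Windows host, point at the published port instead
    hosts => ["http://localhost:9200"]
  }
}
```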


Hi, that seems to have done the trick:

[2021-04-08T04:26:24,944][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-04-08T04:26:25,376][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-04-08T04:26:25,934][INFO ][org.reflections.Reflections] Reflections took 23 ms to scan 1 urls, producing 23 keys and 47 values
[2021-04-08T04:26:26,381][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2021-04-08T04:26:26,468][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}

However, nothing shows up in the Kibana section -> Index Patterns as I thought it would.
I'm using Django, which writes a log file to logs/app.log.
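The repo's actual logging setup isn't shown in this thread, but a simplified, stdlib-only sketch of a formatter that produces JSON lines shaped roughly like the ones in app.log might look like this (the class and logger names are illustrative assumptions):

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Hypothetical formatter emitting one JSON object per line,
    loosely matching the app.log lines shown in this thread."""

    def format(self, record):
        return json.dumps({
            "message": record.getMessage(),
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            # "context" is not a standard LogRecord attribute;
            # fall back to an empty dict when it is absent
            "context": getattr(record, "context", {}),
        })


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("app.views")
log.addHandler(handler)
log.setLevel(logging.DEBUG)
log.debug("This is debug message")
```

The real project presumably adds the nested "django" fields as well; the point is just that each log line is a self-contained JSON document, which matters for how Filebeat/Logstash parse it later.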

Edit:

Could it be that I'm pointing at the wrong Logstash version (the local one), since Logstash also seems to be run from docker-compose.yml?

D:\Dev\Django\django-logs-filebeat>docker-compose up -d
Creating network "django-logs-filebeat_default" with the default driver
Creating django-logs-filebeat_elasticsearch_1 ... done
Creating django-logs-filebeat_api_1           ... done
Creating django-logs-filebeat_filebeat_1      ... done
Creating django-logs-filebeat_nginx_1         ... done
Creating django-logs-filebeat_logstash_1      ... done
Creating django-logs-filebeat_kibana_1        ... done

You solved the localhost issue, so thanks legoguy. Do you have any input on why my log file data doesn't show up in Kibana? I could mark your answer as the solution and start a new question for the Kibana logging part. What do you think is the best way to go?

Whether you use the native or the Docker version of Logstash is up to you. As for the index pattern not showing up: Logstash doesn't create index patterns. You'll either need to create one via the Kibana interface, or, if you're sending data that matches the Filebeat mapping, have Filebeat generate the initial setup.


Oh, can you give me some help? I have a log file in ./src/logs/app.log

that I want to get into Kibana, or Logstash 🙂
I think the project is sending data that should match a Filebeat mapping.

I'm not sure how to read the docker-compose file, but it seems Filebeat is set up there and connected to Logstash.
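If Filebeat is indeed the shipper, the relevant pieces of its filebeat.yml would look roughly like this. The paths and hosts here are assumptions, not copied from the repo; check its own config:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /src/logs/app.log        # assumed container path to the Django log
    json.keys_under_root: true   # decode each JSON line into top-level fields
    json.add_error_key: true     # flag lines that fail to parse

output.logstash:
  hosts: ["logstash:5044"]       # "logstash" resolves inside the compose network
```

With the `json.*` options set, Filebeat decodes each line itself before shipping it on, so the fields arrive already structured.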

LOG FILE CONTENT

    {"message": "Watching for file changes with StatReloader", "timestamp": "2021-04-07T19:06:26.508223+00:00", "level": "INFO", "context": {}, "django": {"app": "elk_demo", "name": "django.utils.autoreload", "filename": "autoreload.py", "funcName": "run_with_reloader", "msecs": 506.2224864959717}}
    {"message": "This is debug message", "timestamp": "2021-04-07T19:09:29.987125+00:00", "level": "DEBUG", "context": {"extraParam": "Gonzalo"}, "django": {"app": "elk_demo", "name": "app.views", "filename": "views.py", "funcName": "get", "msecs": 987.1249198913574}}
    {"message": "This is debug message", "timestamp": "2021-04-07T19:09:52.657151+00:00", "level": "DEBUG", "context": {"extraParam": "Gonzalo"}, "django": {"app": "elk_demo", "name": "app.views", "filename": "views.py", "funcName": "get", "msecs": 657.1516990661621}}
    {"message": "Using proactor: IocpProactor", "timestamp": "2021-04-07T19:17:36.519503+00:00", "level": "DEBUG", "context": {}, "django": {"app": "elk_demo", "name": "asyncio", "filename": "proactor_events.py", "funcName": "__init__", "msecs": 519.5038318634033}}
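Given lines like these, a sketch of a logstash.conf that would accept them from Filebeat and parse them into structured events might look as follows. The port, filter choices, and index name are assumptions for illustration, not taken from the repo (and the `json` filter is redundant if Filebeat already decodes the lines itself):

```conf
input {
  beats {
    port => 5044   # matches Filebeat's output.logstash default
  }
}

filter {
  json {
    source => "message"   # parse the raw JSON log line
  }
  date {
    match => ["timestamp", "ISO8601"]   # use the app's own timestamp
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "django-logs-%{+YYYY.MM.dd}"   # assumed index name
  }
}
```

Once documents land in an index like this, a matching index pattern (e.g. `django-logs-*`) still has to be created in Kibana before anything shows up under Discover.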