Filebeat on local laptop does not talk to Elasticsearch (also on local laptop) - dial tcp [::1]:9200: connect: cannot assign requested address

Hi,

I am trying to set up Elasticsearch 8, Kibana and Filebeat 8, all on my local laptop, using this guide:

I've got ES and Kibana running in Docker containers and communicating just fine.

Next, I want to send events from Filebeat (which pulls them from a GCP PubSub topic) to ES. I can see that Filebeat reads events from PubSub just fine and tries to push them into ES, but it fails with errors:

{"log.level":"debug","@timestamp":"2022-10-20T17:06:49.805Z","log.logger":"processors","log.origin":{"file.name":"processing/processors.go","file.line":210},"message":"Publish event: {\n  \"@timestamp\": \"2022-10-20T17:06:48.466Z\",\n  \"@metadata\": {\n    \"beat\": \"filebeat\",\n    \"type\": \"_doc\",\n    \"version\": \"8.4.3\",\n    \"_id\": \"m_id_1020_1\"\n  },\n  \"event\": {\n    \"id\": \"59279bf715-5952832523902946\",\n    \"created\": \"2022-10-20T17:06:49.799Z\"\n  },\n  \"message\": {\n    \"request_status\": \"500\",\n    \"cid\": \"12345\",\n    \"remote_ip\": \"165.155.130.139\",\n    \"referer\": \"https://www.my.site2.com/\",\n    \"ref_param\": \"https://www.nyt.com\",\n    \"request_method\": \"POST\",\n    \"response_size\": \"124\",\n    \"activity_date\": \"2022-10-20\",\n    \"user_agent\": \"Mozilla/5.0 (X11; CrOS aarch64 13421.102.0) AppleWebKit/537.36 (KHTML, like Gecko)Chrome/86.0.4240.199 Safari/537.36\",\n    \"event_timestamp_millis\": \"1666285498000\",\n    \"request_size\": \"52\",\n    \"latency\": \"1.3\",\n    \"logstash_id\": \"m_id_1020_1\"\n  },\n  \"input\": {\n    \"type\": \"gcp-pubsub\"\n  },\n  \"host\": {\n    \"containerized\": true,\n    \"ip\": [\n      \"172.17.0.2\"\n    ],\n    \"mac\": [\n      \"02:42:ac:11:00:02\"\n    ],\n    \"hostname\": \"d61a776a9c35\",\n    \"architecture\": \"x86_64\",\n    \"name\": \"d61a776a9c35\",\n    \"os\": {\n      \"kernel\": \"5.10.47-linuxkit\",\n      \"codename\": \"focal\",\n      \"type\": \"linux\",\n      \"platform\": \"ubuntu\",\n      \"version\": \"20.04.5 LTS (Focal Fossa)\",\n      \"family\": \"debian\",\n      \"name\": \"Ubuntu\"\n    }\n  },\n  \"agent\": {\n    \"version\": \"8.4.3\",\n    \"ephemeral_id\": \"0ce2530a-8f5d-4710-ac31-2a64b5b31272\",\n    \"id\": \"c5a8197e-6dc7-4c55-bc43-779ac473f1f1\",\n    \"name\": \"d61a776a9c35\",\n    \"type\": \"filebeat\"\n  },\n  \"ecs\": {\n    \"version\": \"8.0.0\"\n  
}\n}","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2022-10-20T17:06:50.808Z","log.logger":"publisher_pipeline_output","log.origin":{"file.name":"pipeline/client_worker.go","file.line":139},"message":"Connecting to backoff(elasticsearch(https://localhost:9200))","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"debug","@timestamp":"2022-10-20T17:06:50.810Z","log.logger":"esclientleg","log.origin":{"file.name":"eslegclient/connection.go","file.line":267},"message":"ES Ping(url=https://localhost:9200)","service.name":"filebeat","ecs.version":"1.6.0"}
{"log.level":"error","@timestamp":"2022-10-20T17:06:50.813Z","log.logger":"esclientleg","log.origin":{"file.name":"transport/logging.go","file.line":38},"message":"Error dialing dial tcp 127.0.0.1:9200: connect: connection refused","service.name":"filebeat","network":"tcp","address":"localhost:9200","ecs.version":"1.6.0"}
{"log.level":"debug","@timestamp":"2022-10-20T17:06:50.814Z","log.logger":"esclientleg","log.origin":{"file.name":"eslegclient/connection.go","file.line":271},"message":"Ping request failed with: Get \"https://localhost:9200\": dial tcp 127.0.0.1:9200: connect: connection refused","service.name":"filebeat","ecs.version":"1.6.0"}

Here is my full filebeat.yml for reference:

queue.mem:
  events: 4096
  flush.min_events: 2048
  flush.timeout: 1s

# ============================== Filebeat inputs ===============================

filebeat.inputs:
- type: gcp-pubsub
  enabled: true
  project_id: ${PROJECT_ID}
  topic: ${PUBSUB_INPUT_TOPIC}
  subscription.name: ${SUBSCRIPTION_NAME}
  fields_under_root: true


# ======================= Elasticsearch template setting =======================
setup.template.name: "ibc-parsed-logs"
setup.template.pattern: "ibc-parsed-logs-*"
setup.template.json.enabled: true
setup.template.json.path: "ibc_es_template.json"
setup.template.json.name: "ibc-parsed-logs-template"
setup.template.enabled: true
setup.ilm.enabled: false

# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.
output.console:
  enabled: false
  pretty: true


# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  enabled: true
  index: "ibc-parsed-logs"
  parameters.pipeline: "geoip-info"
  hosts: ${ES_HOSTS}
  protocol: "https"
  api_key: ${ES_API_KEY}

# ============================= X-Pack Monitoring ==============================
monitoring.enabled: true
monitoring.cluster_uuid: ${MON_CLUSTER_UUID}

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - decode_json_fields:
      fields: ["message"]
      add_error_key: true
      document_id: "event_uuid"

# ================================== Logging ===================================
logging.metrics.enabled: true
logging.enabled: true
logging.level: debug
logging.to_files: true
logging.files:
  path: /usr/share/filebeat/f_logs
  name: filebeat
  keepfiles: 10
  permissions: 0640
logging.selectors: ["*"]


I run Filebeat in a Docker container as well, as follows:
Dockerfile:

FROM docker.elastic.co/beats/filebeat:8.4.3
COPY filebeat.yml /usr/share/filebeat/filebeat.yml
COPY ibc_es_template.json /usr/share/filebeat/ibc_es_template.json
USER root
RUN chmod +x filebeat.yml

I've built a Docker image with this Dockerfile: filebeat8-local-min:1.0

Docker container run command:

docker run -it --rm  \
 -v /Users/mpopova/.config/gcloud/application_default_credentials.json:/usr/share/filebeat/application_default_credentials.json \
-v "$(pwd)/f_logs:/usr/share/filebeat/f_logs" \
-e GOOGLE_APPLICATION_CREDENTIALS=/usr/share/filebeat/application_default_credentials.json \
 -e PROJECT_ID=my-gcp-pr \
 -e PUBSUB_INPUT_TOPIC=logs-for-es-marina \
 -e SUBSCRIPTION_NAME=logs-for-es-marina-sub \
 -e ES_HOSTS="https://localhost:9200" \
 -e MON_CLUSTER_UUID="qQNb9W_xxx" \
 -e ES_API_KEY="urks_xxx" \
 -e DEBUG_LEVEL=info \
 filebeat8-local-min:1.0

I tried a few variations of the ES_HOSTS variable I pass:
-- http://localhost:9200
-- http://0.0.0.0:9200

same result...
What am I missing?

Thank you!!
Marina

Hi @ppine7

I think we can simplify this dramatically... if your goal is simply to debug your pipeline (you can come back and make a fully secured Docker implementation another time):

  1. Do not use security yet... we can come back to that. The setup above will not work even if you get the networking straightened out, because you have self-signed certs, and the containers are separate, so each one's localhost is different, etc. This is Docker plumbing - yes, it can all be fixed, but you are going to spend time on it.
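To make the localhost problem concrete: inside the Filebeat container, `localhost` resolves to the container itself, not to your laptop, which is exactly why the dial fails. If you did want to keep the containerized setup, one commonly used workaround on Docker Desktop (an assumption about your environment; it is not needed for the tar.gz route below) is the special hostname `host.docker.internal`:

```
# Hypothetical variant of the original `docker run` command:
# point Filebeat at the host machine instead of the container's own loopback.
docker run -it --rm \
  -e ES_HOSTS="https://host.docker.internal:9200" \
  filebeat8-local-min:1.0   # plus the other -v/-e flags from the original command
```

Even with the host reachable, the self-signed certs from the secured setup would still get in the way, which is why skipping security for local debugging is the faster path.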

If you just want to get to debugging your pipeline quickly, here is my suggestion:

  1. Use this Docker Compose; it will set up Elasticsearch and Kibana without security:
---
version: '3'
services:
  elasticsearch:
    container_name: es01
    image: docker.elastic.co/elasticsearch/elasticsearch:${TAG}
    # 8.x
    environment: ['CLI_JAVA_OPTS=-Xms1g -Xmx1g','bootstrap.memory_lock=true','discovery.type=single-node','xpack.security.enabled=false', 'xpack.security.enrollment.enabled=false']
    ports:
      - 9200:9200
    networks:
      - elastic
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536

  kibana:
    image: docker.elastic.co/kibana/kibana:${TAG}
    container_name: kib01
    environment:
      XPACK_APM_SERVICEMAPENABLED: "true"
      XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY: d1a66dfd-c4d3-4a0a-8290-2abcb83ab3aa
      LOGGING_ROOT_LEVEL: error


    ports:
      - 5601:5601
    networks:
      - elastic

networks:
  elastic:

then run

TAG=8.4.3 docker-compose -f compose.yml up

You will have Elasticsearch and Kibana running at http://localhost:9200 and http://localhost:5601 respectively.

You can adjust 'CLI_JAVA_OPTS=-Xms1g -Xmx1g' if you want to give Elasticsearch more resources.
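Once the stack is up, a quick sanity check from the host (assuming the ports published in the compose file above) is to curl both services directly:

```
# Should return the cluster info JSON (name, version, tagline) if ES is reachable
curl http://localhost:9200

# Kibana should answer on its port as well
curl -I http://localhost:5601
```

If the first curl succeeds from your laptop but Filebeat in a container still cannot connect, the problem is container networking, not Elasticsearch.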

  2. Then simply download the tar.gz of Filebeat, set it up, and run it from the command line...

This is how I debug 95% of issues...

Then Filebeat will require nothing but the defaults for connections.

Running Filebeat in a container is all well and good, but for debugging it is painful... the tar.gz is much easier to debug from.
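The tar.gz route looks roughly like this (the exact artifact URL is an assumption for 8.4.3 on an Intel Mac; pick the build matching your OS and architecture from the Filebeat downloads page):

```
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.4.3-darwin-x86_64.tar.gz
tar xzvf filebeat-8.4.3-darwin-x86_64.tar.gz
cd filebeat-8.4.3-darwin-x86_64

# -e logs to stderr so you see errors immediately; -c points at your config
./filebeat -e -c filebeat.yml
```

With security disabled and Filebeat running directly on the host, `localhost:9200` works with no certs, tokens, or Docker networking in the way.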

Let me know when you get this setup....

great, thanks, Stephen! Will try tonight or the first thing tomorrow morning!
BTW, I did not want to have the full shebang security setup locally - but that's what ES 8 forces on you :frowning: I was trying to find a way to just say "disable all security" - it is ridiculous to set up tokens, sign certs, etc., just to run all the components on your local laptop... but I did not find a way.

Glad you showed me one!!!
I also hope Elastic will reconsider this security-by-default mandate - so we can spin up ES and the other services locally as easily as we could with v7!

No, Elastic will never go back to no security by default - long discussion there...

BUT if you simply set these in elasticsearch.yml before the initial start, then there is no security:

xpack.security.enabled: false
xpack.security.enrollment.enabled: false

Also, I just showed you the simple Docker Compose, so now you have 2 ways.

Those 2 settings disable security... but now it is an action that has to be taken by the user.

Alright, I finally managed to get a shiny new local setup up and running!
Thank you Stephen for your suggestions!

What I ended up doing is:

-- downloaded and installed Elasticsearch, Kibana and Filebeat, all version 8.4.3, all from .tar.gz archives

-- before starting ES for the first time, updated its elasticsearch.yml with the following settings (the 2 that Stephen suggested and one more I found online - for good measure :slight_smile: ):

xpack.security.autoconfiguration.enabled: false
xpack.security.enabled: false
xpack.security.enrollment.enabled: false

-- updated kibana.yml - added:

xpack.security.authc.http.enabled: false

-- started ES, then Kibana - verified they can connect and that I can see ES monitoring data in Kibana (after enabling self-monitoring in Kibana)

-- updated my existing filebeat.yml to connect to the local ES, still reading input data from GCP PubSub - and verified that events I publish to PubSub are correctly pushed by Filebeat into ES! :slight_smile:

Thank you, @stephenb ! Now I have a working local setup I can easily play with!
... and now I can return to the GEO IP pipeline setup in a separate post
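Since the filebeat.yml above sets `parameters.pipeline: "geoip-info"`, that ingest pipeline has to exist in Elasticsearch before events flow through it, or indexing will fail. A minimal sketch against the unsecured local ES (the field names here are assumptions based on the event shown earlier - adjust them to your documents):

```
# Create a hypothetical geoip-info ingest pipeline
curl -X PUT "http://localhost:9200/_ingest/pipeline/geoip-info" \
  -H 'Content-Type: application/json' -d'
{
  "description": "Enrich events with GeoIP data",
  "processors": [
    {
      "geoip": {
        "field": "message.remote_ip",
        "target_field": "message.geo",
        "ignore_missing": true
      }
    }
  ]
}'
```

The `geoip` processor resolves the IP against the bundled GeoLite2 database and writes the result to `target_field`; `ignore_missing` keeps events without an IP from failing the pipeline.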


Did you turn on the Filebeat monitoring? I thought that was one of your variables...

yes, I turned on monitoring for Filebeat too - here is my filebeat.yml that works well locally:

###################### Filebeat Configuration Example #########################
queue.mem:
  events: 4096
  flush.min_events: 2048
  flush.timeout: 1s

# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: gcp-pubsub
  enabled: true
  project_id: ${PROJECT_ID}
  topic: ${PUBSUB_INPUT_TOPIC}
  subscription.name: ${SUBSCRIPTION_NAME}
  fields_under_root: true

# ======================= Elasticsearch template setting =======================
setup.template.name: "ibc-parsed-logs"
setup.template.pattern: "ibc-parsed-logs-*"
setup.template.json.enabled: true
setup.template.json.path: "ibc_es_template.json"
setup.template.json.name: "ibc-parsed-logs-template"
setup.template.enabled: true
setup.ilm.enabled: false

# ================================== Outputs ===================================
output.console:
  enabled: false
  pretty: true

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  enabled: true
  index: "ibc-parsed-logs"
  parameters.pipeline: "geoip-info"
  #hosts: ${ES_HOSTS}
  hosts: "http://localhost:9200"
  #protocol: "https"
  #api_key: ${ES_API_KEY}

# ============================= X-Pack Monitoring ==============================
monitoring.enabled: true
monitoring.cluster_uuid: ${MON_CLUSTER_UUID}

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - decode_json_fields:
      fields: ["message"]
      add_error_key: true
      document_id: "event_uuid"

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.