Connecting Filebeat to Elastic Cloud

Hi, how would I configure the output of this filebeat.yml file to send logs to Elastic Cloud?

filebeat.inputs:
  - enabled: true
    paths:
      - /data/fatt/log/fatt.log
    fields:
      type: Fatt
    fields_under_root: true
  
  - enabled: true
    paths:
      - /data/suricata/log/eve.json
    fields:
      type: Suricata
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 
  
  - enabled: true
    paths:
      - /data/p0f/log/p0f.json
    fields:
      type: P0f
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 
  
  - enabled: true
    paths:
      - /data/adbhoney/log/adbhoney.json
    fields:
      type: Adbhoney
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 
  
  - enabled: true
    paths:
      - /data/ciscoasa/log/ciscoasa.log
    fields:
      type: Ciscoasa
    fields_under_root: true

  - enabled: true
    paths:
      - /data/citrixhoneypot/logs/server.log
    fields:
      type: CitrixHoneypot
    fields_under_root: true

  - enabled: true
    paths:
      - /data/conpot/log/*.json
    fields:
      type: ConPot
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 
  
  - enabled: true
    paths:
      - /data/cowrie/log/cowrie.json
    fields:
      type: Cowrie
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 
  
  - enabled: true
    paths:
      - /data/dionaea/log/dionaea.json
    fields:
      type: Dionaea
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 
  
  - enabled: true
    paths:
      - /data/dicompot/log/dicompot.log
    fields:
      type: Dicompot
    fields_under_root: true

  - enabled: true
    paths:
      - /data/elasticpot/log/elasticpot.json
    fields:
      type: ElasticPot
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 

  - enabled: true
    paths:
      - /data/glutton/log/glutton.log
    fields:
      type: Glutton
    fields_under_root: true

  - enabled: true
    paths:
      - /data/heralding/log/auth.csv
    fields:
      type: Heralding
    fields_under_root: true

  - enabled: true
    paths:
      - /data/honeypy/log/json.log
    fields:
      type: Honeypy
    fields_under_root: true

  - enabled: true
    paths:
      - /data/honeysap/log/honeysap-external.log
    fields:
      type: Honeysap
    fields_under_root: true

  - enabled: true
    paths:
      - /data/honeytrap/log/attackers.json
    fields:
      type: Honeytrap
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 

  - enabled: true
    paths:
      - /data/ipphoney/log/ipphoney.json
    fields:
      type: Ipphoney
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 

  - enabled: true
    paths:
      - /data/mailoney/log/commands.log
    fields:
      type: Mailoney
    fields_under_root: true

  - enabled: true
    paths:
      - /data/medpot/log/medpot.log
    fields:
      type: Medpot
    fields_under_root: true

  - enabled: true
    paths:
      - /data/rdpy/log/rdpy.log
    fields:
      type: Rdpy
    fields_under_root: true

  - enabled: true
    paths:
      - /data/nginx/log/access.log
    fields:
      type: NGINX
    fields_under_root: true

  - enabled: true
    paths:
      - /data/tanner/log/tanner_report.json
    fields:
      type: Tanner
    fields_under_root: true
    json.keys_under_root: true
    json.overwrite_keys: true 

processors:
  - drop_fields:
      fields: ["agent", "ecs", "log", "host", "@metadata"]
  - add_fields:
      target: host_info
      fields:
        exit_ip: "${MY_EXTIP}"
        in_ip: "${MY_INTIP}"
        hostname: "${MY_HOSTNAME}"

output.logstash:
  hosts: ["<YOUR_CENTRAL_LOGSTASH_IP>:64299"]

Hi @Richard_Phillips_Roy,

In the configuration above, I see you are sending logs to Logstash instead. You will have to replace that output with one of the following options:
Configure the Elasticsearch output | Filebeat Reference [7.12] | Elastic
Configure the output for Elasticsearch Service on Elastic Cloud | Filebeat Reference [7.12] | Elastic
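
For reference, a minimal sketch of those two options, using placeholder values in the same style as the config above (the real Cloud ID and credentials come from the Elastic Cloud console):

```yaml
# Option A: Elastic Cloud shorthand — replaces output.logstash entirely
cloud.id: "<YOUR_DEPLOYMENT_NAME>:<YOUR_CLOUD_ID>"
cloud.auth: "elastic:<YOUR_PASSWORD>"

# Option B: explicit Elasticsearch output (do not combine with Option A)
# output.elasticsearch:
#   hosts: ["https://<YOUR_DEPLOYMENT>.es.<REGION>.cloud.es.io:9243"]
#   username: "elastic"
#   password: "<YOUR_PASSWORD>"
```

Note that `cloud.id`/`cloud.auth` take precedence over any `output.elasticsearch` settings, and Filebeat refuses to start if more than one output (e.g. both `output.logstash` and `output.elasticsearch`) is enabled.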

Hi @MarianaD,

I tried it like this

output.elasticsearch:
  hosts: ["https://i-o-optimized-deployment-b72df9.es.us-west1.gcp.cloud.es.io:9243"]
  username: "elastic"
  password: "F9e0fLuPJHz40FuosDFMFfjF"

but I'm getting an error like this. What am I doing wrong? (I'm using my Elastic Cloud credentials.)

● filebeat.service - Filebeat sends log files to Logstash or directly to Elasticsearch.
   Loaded: loaded (/lib/systemd/system/filebeat.service; enabled; vendor preset: enabled)
   Active: failed (Result: exit-code) since Fri 2021-04-23 01:50:07 UTC; 3min 16s ago
     Docs: https://www.elastic.co/products/beats/filebeat
  Process: 5970 ExecStart=/usr/share/filebeat/bin/filebeat --environment systemd $BEAT_LOG_OPTS $BEAT_CONFIG_OPTS $BEAT_PATH_OPTS (code=exited, status=1/FAILURE)
 Main PID: 5970 (code=exited, status=1/FAILURE)

Apr 23 01:50:07 specificdoe systemd[1]: filebeat.service: Service RestartSec=100ms expired, scheduling restart.
Apr 23 01:50:07 specificdoe systemd[1]: filebeat.service: Scheduled restart job, restart counter is at 5.
Apr 23 01:50:07 specificdoe systemd[1]: Stopped Filebeat sends log files to Logstash or directly to Elasticsearch..
Apr 23 01:50:07 specificdoe systemd[1]: filebeat.service: Start request repeated too quickly.
Apr 23 01:50:07 specificdoe systemd[1]: filebeat.service: Failed with result 'exit-code'.
Apr 23 01:50:07 specificdoe systemd[1]: Failed to start Filebeat sends log files to Logstash or directly to Elasticsearch..
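
The systemd status above only shows that the process exited; it hides the underlying error. Filebeat ships built-in `test` subcommands that usually surface the actual config or connection problem (paths below assume the standard Debian/Ubuntu package layout):

```shell
# Validate filebeat.yml syntax and settings
filebeat test config -c /etc/filebeat/filebeat.yml

# Verify the connection and credentials for the configured output
filebeat test output -c /etc/filebeat/filebeat.yml

# Show the recent startup messages that systemd status truncates
journalctl -u filebeat.service --no-pager -n 50
```

A duplicate-output error here (two outputs enabled at once) or an authentication failure against the cloud endpoint would both explain the restart loop in the log above.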

Hi @Richard_Phillips_Roy, did you remove the output.logstash configuration from your file? If not, can you remove it and retry?

Hi @MarianaD, I did, but I'm still facing the same issue.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.