Routing data from text files in multiple folders to multiple indices

My Logstash configuration is this:

input {
  beats {
    port => 5044
  }
}
filter {
  xml {
    store_xml => "false"
    source => "message"
    xpath => ["/propertyAvailability/hotelRates/hotel[2]/bookingChannel[4]/ratePlan[3]/miscInfo/text()", "hhBedType"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "data1"
  }
}
My Filebeat configuration is this:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:\HotelHub_ELK\Data1*.txt

Right now the data from the data1 folder goes to the data1 index.
What we need: the data will be stored in four folders (data1, data2, data3, data4), and each folder's data should go to a different index.

AFAIK events from Filebeat should have a source field containing the original file path. So you could use a grok pattern to extract the folder name from that, save this information in [@metadata][logfolder], and then set the index option of the Elasticsearch output to %{[@metadata][logfolder]}. Then you could just set the Filebeat path to D:\HotelHub_ELK\Data*.txt to read all folders starting with Data.
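A minimal sketch of that first approach, assuming Windows paths like D:\HotelHub_ELK\Data1\file.txt and that the path arrives in the source field (on newer Filebeat versions it may be [log][file][path] instead):

```conf
filter {
  # Capture the parent folder name (e.g. "Data1") from the file path.
  # GREEDYDATA backtracks so WORD matches the segment before the last backslash.
  grok {
    match => { "source" => "^%{GREEDYDATA}\\%{WORD:[@metadata][logfolder]}\\%{GREEDYDATA}$" }
  }
  # Elasticsearch index names must be lowercase.
  mutate {
    lowercase => [ "[@metadata][logfolder]" ]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][logfolder]}"
  }
}
```

Because the folder name is stored under [@metadata], it is available for the index setting but is not shipped with the document itself.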

Another option would be to have one Filebeat input per folder and add a different field to each of them.
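A sketch of that variant with two of the four folders; the field name data_index and the exact paths are illustrative:

```yaml
filebeat.inputs:
- type: log
  paths:
    - D:\HotelHub_ELK\Data1\*.txt
  fields:
    data_index: data1
- type: log
  paths:
    - D:\HotelHub_ELK\Data2\*.txt
  fields:
    data_index: data2
```

In the Logstash Elasticsearch output you could then set index => "%{[fields][data_index]}" to route each input's events to its own index.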


filebeat.inputs:
- type: log
  enabled: true
  paths:
    - D:\HotelHub\ATPI\OHHPayload\Payload**.txt

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

setup.kibana:
  host: "192.168.100.136:5601"

output.logstash:
  hosts: ["192.168.100.136:5044"]

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.