Exiting: Error reading config file: required 'object', but found 'string' in field 'filebeat.inputs.0' (source:'filebeat.yml')

Hi all, I have created the sample filebeat.yml below and verified it with yamllint, but I am still getting the same error. I am trying to set up Filebeat to work with Elasticsearch, and Filebeat itself won't start up; it exits with the error above.

My config file is

###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.

# ============================== Filebeat inputs ===============================

filebeat.inputs:
  - type:log document_type:Elasticlogs enabled:true paths:-
    D:\Elastic\elasticsearch-8.8.2-windows-x86_64\elasticsearch-8.8.2\logs
# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: D:\Elastic\filebeat-8.8.2-windows-x86_64\filebeat-8.8.2-windows-x86_64\modules.d\*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

Please help me.

Hi @Bhakti_Bhabal welcome to the community.

I formatted your code for you. Please do this in the future; otherwise we cannot help with syntax errors.

You can format your code by putting 3 backticks ``` on the line before and after the code, or by selecting the code and pressing </>. If you click the pencil (edit) icon on your first post, you will see what I did.

What version are you using?

This looks malformed.

I am using Filebeat 8.8.2.

Thanks... that code is still malformed. A couple of things:
1st, you should use the filestream input, as the log input is deprecated.
2nd, there is no document_type setting.
3rd, there are many syntax issues; perhaps they are all cut-and-paste artifacts.

filebeat.inputs:
  - type:log                    # <-- no space after ':'
    document_type:Elasticlogs   # <-- not a valid setting, and no space after ':'
    enabled:true                # <-- no space after ':'
    paths: 
      - D:\Elastic\elasticsearch-8.8.2-windows-x86_64\elasticsearch-8.8.2\logs

It should look something like this; follow the example from the docs:

filebeat.inputs:
  - type: filestream 
    id: my-filestream-id
    enabled: true
    paths: 
      - D:\Elastic\elasticsearch-8.8.2-windows-x86_64\elasticsearch-8.8.2\logs
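One more detail worth checking (not stated in the thread): `paths` entries should match log files, not a bare directory, so you typically want a glob such as `\*.log` at the end. A hedged sketch of the input above with a file glob (the `id` and paths are just the thread's examples):

```yaml
filebeat.inputs:
  - type: filestream
    id: my-filestream-id
    enabled: true
    paths:
      # match files inside the logs directory, not the directory itself
      - D:\Elastic\elasticsearch-8.8.2-windows-x86_64\elasticsearch-8.8.2\logs\*.log
```

You can also validate the file before starting the service with `filebeat.exe test config -c filebeat.yml`, which reports YAML and setting errors like the one in this thread.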

Thanks a lot Stephen, it worked. I have one question: when I insert data into Elastic, Kibana takes the timestamp of when the logs were inserted, not the actual times mentioned in the logs. How can I get the timestamp printed in the message and create a data view?

You will need to create an ingest pipeline, parse your message, etc., and then set that as the timestamp; that is the common flow.
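As a rough illustration of that flow, here is a minimal ingest pipeline sketch (Kibana Dev Tools syntax). It assumes, hypothetically, that each log line starts with a bracketed timestamp like `[2023-07-20T10:00:00,123][INFO ]...`; the pipeline name, the dissect pattern, and the date format must all be adapted to your actual log lines:

```json
PUT _ingest/pipeline/my-log-timestamp
{
  "description": "Parse the timestamp out of the message (hypothetical log format)",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "[%{log_ts}]%{rest}"
      }
    },
    {
      "date": {
        "field": "log_ts",
        "formats": ["yyyy-MM-dd'T'HH:mm:ss,SSS"],
        "target_field": "@timestamp"
      }
    }
  ]
}
```

Filebeat can then be pointed at the pipeline via the `pipeline` setting under `output.elasticsearch` in filebeat.yml, so every event is parsed on ingest and `@timestamp` reflects the time in the log line rather than the ingest time.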

Open a new topic with a sample of your log, the type/source of the log, and ask for help parsing your log and setting the timestamp.

If it is a common type, there may be a built-in module/parser.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.