"Hi, I am very new to ELK stack, kindly support me.
I am currently configuring Elasticsearch, Logstash, Beats, and Kibana v7.7.1 to collect logs from Windows servers, Linux servers, and network devices, as well as application logs in other formats such as CSV, and to store them in Elasticsearch under different indices.
Currently the whole ELK stack is deployed and running on the same server.
Some of the challenges I'm facing include:
- If I run the Filebeat service on a Linux server where I intend to collect Linux logs and application log files in CSV, the error generated says no output is defined (sketch 1 after this list shows what I think my filebeat.yml needs to look like).
- How do I collect different data types such as plain logs, CSV, and syslog located on the same Linux server (different paths such as /var/log/*.log and /home/user1/foldername/*.csv) using Logstash and Beats, with different indices defined in the Elasticsearch output? In /etc/logstash/conf.d I have defined inputs, filters, and outputs in separate configuration files (sketches 2 and 3 after this list show what I mean).
- If I check the Logstash log file, I also see this error: "[2020-06-29T18:47:06,324][WARN ][logstash.outputs.elasticsearch][main][516a047366b4f50cff4206391d217db002445555669ec17f017f0f207a8f7b73] Could not index event to Elasticsearch. "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:79"
- I've chosen to ingest data into Elasticsearch using Logstash; is it possible to have a single input file that defines the different data types, different filters, and a single output file with a set of different indices per data type (log, syslog, CSV)? Sketches 2 and 3 after this list are roughly what I have in mind.
- Also, as well as collecting Winlogbeat and Filebeat data using Logstash with different indices, I need to collect syslog from network devices (switches/routers) into a separate index.
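Sketch 1 is a minimal sketch of what I believe my filebeat.yml needs to contain; the paths match the ones above, but the tags and the Logstash host/port are examples I made up, not confirmed values:

```
# Sketch 1: filebeat.yml (minimal; tags and host are hypothetical)
filebeat.inputs:
  # plain Linux log files
  - type: log
    paths:
      - /var/log/*.log
    tags: ["linux-logs"]

  # application CSV files
  - type: log
    paths:
      - /home/user1/foldername/*.csv
    tags: ["csv"]

# My guess is that the "no output defined" error means a section
# like this is missing or still commented out:
output.logstash:
  hosts: ["my-elk-server:5044"]   # example host:port
```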
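Sketch 2 is the kind of single input file I have in mind under /etc/logstash/conf.d; the ports and type names are examples (I picked a high syslog port on the assumption that Logstash does not run as root):

```
# Sketch 2: /etc/logstash/conf.d/01-inputs.conf (ports and type names are examples)
input {
  # events shipped by Filebeat and Winlogbeat
  beats {
    port => 5044
  }

  # syslog from network devices (switches/routers)
  udp {
    port => 5514
    type => "network-syslog"
  }
  tcp {
    port => 5514
    type => "network-syslog"
  }
}
```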
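Sketch 3 is roughly the filter and single output file I am asking about, using conditionals to parse the CSV files and route each data type to its own index; the column names and index names are invented examples:

```
# Sketch 3: /etc/logstash/conf.d/30-filter-output.conf (columns and indices are examples)
filter {
  if "csv" in [tags] {
    csv {
      separator => ","
      columns => ["timestamp", "user", "action"]   # hypothetical columns
    }
  }
}

output {
  if "csv" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "app-csv-%{+YYYY.MM.dd}"
    }
  } else if [type] == "network-syslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "network-syslog-%{+YYYY.MM.dd}"
    }
  } else if [agent][type] == "winlogbeat" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "winlogbeat-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "filebeat-%{+YYYY.MM.dd}"
    }
  }
}
```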
Thank you in advance.