Create multiple indices for different streams of data

Hi,
I am very new to setting up Logstash, but I have been using Kibana at the user level, creating dashboards and analyses from basic templates.

My input logs are all JSON. This is the Logstash config that currently works:

input {
  beats {
    port => "5044"
    codec => "json"
    include_codec_tag => false
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "fourindex"
  }
}

The Filebeat settings are here:

- type: log
  # Change to true to enable this prospector configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - C:\ProgramData\filebeat\logs\atg\Rev4*.json
  close_removed: true
  clean_removed: true


Question (1): How do I add another stream of data placed in a different path, for example C:\ProgramData\filebeat\logs\NewData*.json, and what is the corresponding change in the Logstash config so that it is created as a separate index in Kibana?
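One common approach (not shown in the thread, so treat it as a sketch) is to define a second Filebeat input for the new path and tag each input, then route on the tag in Logstash. The tag names `rev4` and `newdata` and the index name `newdata-index` below are made-up examples:

```
# filebeat.yml -- one input per stream, each with its own tag
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - C:\ProgramData\filebeat\logs\atg\Rev4*.json
    tags: ["rev4"]
  - type: log
    enabled: true
    paths:
      - C:\ProgramData\filebeat\logs\NewData*.json
    tags: ["newdata"]
```

Then the Logstash output can check which tag the event carries:

```
# logstash pipeline -- route on the tag added by Filebeat
output {
  if "newdata" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "newdata-index"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "fourindex"
    }
  }
}
```

Routing on tags avoids having to match on full Windows paths, where the backslashes need careful escaping in conditionals.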

Hi @leinad,

Input:
You can use the file input plugin to read the logs from this path:
C:\ProgramData\filebeat\logs\NewData.json*

Output:
Use if / else if conditions.
Example:

if [path] =~ "NewData" {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "host:9200" ]
    index => "new-index"
  }
}
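A caveat worth checking (my addition, not from the thread): when events come in through the beats input, Filebeat 7.x ships the source file path under the nested field [log][file][path] rather than a top-level [path] field, and == requires an exact match of the whole value. A regex match on the nested field is usually what you want:

```
output {
  # Filebeat 7.x puts the originating file path in log.file.path
  if [log][file][path] =~ "NewData" {
    elasticsearch {
      hosts => [ "host:9200" ]
      index => "new-index"
    }
  }
}
```

If the condition never matches, no event reaches the elasticsearch output and no index is created, with no error in the logs.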

Thanks @tahseen_fatima. I am new, so I am trying to figure it out step by step.

Hi @tahseen_fatima, this is the config I used, and I got the error below.

---------------SETTING IN FILEBEAT---------------------

- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- C:\ProgramData\filebeat\logs\atg\Rev4*.json
    - D:\Logs\Rev4*.json
    - D:\Logs\Rev5*.json
  close_removed: true

--------------------logstash config CODE USED----------------------------
input {
  beats {
    port => "5044"
    codec => "json"
    include_codec_tag => false
  }
}

output
{
if [path] == "D:\Logs\Rev4"{
stdout { codec => rubydebug }
elasticsearch
{
hosts => "localhost:9200"
index => "Rev4"
}
else if [path] == "D:\Logs\Rev5"{
stdout { codec => rubydebug }
{
hosts => "localhost:9200"
index => "Rev5"
}
}

-------ERROR AT LOGSTASH OUTPUT---------------------
[2021-01-25T09:53:02,106][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2021-01-25T09:53:02,473][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "and", "or", "xor", "nand", "{" at line 17, column 13 (byte 196) after output \n{\n\tif [path] == "D:\Logs\Rev4\"{\nstdout { codec => rubydebug }\n\telasticsearch \n\t{ \n\t\thosts => "", :backtrace=>["D:/Kibana/logstash-7.10.1/logstash-core/lib/logstash/compiler.rb:32:in compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:184:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in initialize'", "D:/Kibana/logstash-7.10.1/logstash-core/lib/logstash/java_pipeline.rb:47:in initialize'", "D:/Kibana/logstash-7.10.1/logstash-core/lib/logstash/pipeline_action/create.rb:52:in execute'", "D:/Kibana/logstash-7.10.1/logstash-core/lib/logstash/agent.rb:365:in block in converge_state'"]}
[2021-01-25T09:53:02,517][DEBUG][logstash.instrument.periodicpoller.os] Stopping

Hi @leinad

I can see there is an error at line 17, column 13.

From your error message I can see the brackets are not closed properly in the output.

output {
  if [path] == "D:\Logs\Rev4" {
    stdout { codec => rubydebug }
    elasticsearch {
      hosts => "localhost:9200"
      index => "rev4" # note: Elasticsearch index names must be lowercase
    }
  } # you forgot to close this elasticsearch bracket here
  else if [path] == "D:\Logs\Rev5" {
    stdout { codec => rubydebug }
    elasticsearch { # the "elasticsearch" keyword was also missing here
      hosts => "localhost:9200"
      index => "rev5"
    }
  } # same mistake here
}

Regards,
Tahseen

Hi @tahseen_fatima, thanks for the corrections. I am able to run the config successfully, but I did not see the index created in Elasticsearch, and I am not sure where to start debugging. I am able to create an index without using any path separation. Please advise.

Hi @leinad

Instead of path separation, take a unique keyword from your logs. For example, you can also take the host or any other field name or field value, but it should be unique.

if [host] == "10.5.----"
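Putting the pieces together, a complete conditional output might look like the sketch below. It assumes Filebeat 7.x (which puts the file path under [log][file][path]) and uses lowercase index names, which Elasticsearch requires; a catch-all else branch helps confirm whether events are matching at all:

```
output {
  if [log][file][path] =~ "Rev4" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "rev4"
    }
  } else if [log][file][path] =~ "Rev5" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "rev5"
    }
  } else {
    # events landing here mean neither condition matched --
    # inspect the printed event to see which field holds the path
    stdout { codec => rubydebug }
  }
}
```

If every event falls into the else branch, the rubydebug output will show the actual field layout, which tells you what to match on.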

Regards,
Tahseen

Hi @tahseen_fatima, thanks for your assistance. I am stuck with some mappings. I am trying to create a table from array values. Basically I have two arrays:
array1 = [str1, str2, str3]
array2 = [val1, val2, val3]
How do I display them in two columns, so that each row of the table is:
str1, val1
str2, val2
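For what it's worth, one way to pair the two arrays at ingest time (a sketch only; the field names array1, array2, and pairs are assumptions) is a Logstash ruby filter that zips them into an array of objects, which Kibana can then render as table rows:

```
filter {
  ruby {
    code => '
      a = event.get("array1") || []
      b = event.get("array2") || []
      # zip pairs element i of a with element i of b
      event.set("pairs", a.zip(b).map { |k, v| { "name" => k, "value" => v } })
    '
  }
}
```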

Hi @leinad

I believe your Logstash issue has been resolved, so could you open a separate post for this? That way people will have an easier time finding the topic later on.

Regards,
Tahseen

Hi @tahseen_fatima, yes, I opened a new post as well. Thanks!