Data fields in Kibana Discover getting overlapped

Hi Team,

I have created a Logstash pipeline using two CSV files and I am getting output in Kibana. But whenever I view the data in Kibana, the output fields get overlapped. Please review the configuration files below and help me out.

[root@xxxx logstash]# cat /etc/logstash/pipelines.yml
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"

- pipeline.id: my-first-pipeline
  path.config: /etc/logstash/aws-cost.conf
  pipeline.workers: 2

*******************

[root@xxxxxx logstash]# cat aws-cost.conf
input {
    file {
       path => "/home/xxxxxx/EBSInventory.csv"
       start_position => "beginning"
       sincedb_path => "/dev/null"
    }
}
filter {
    csv {
        separator => ","
        columns => ["Volume Name","Volume ID","Volume type","Size","IOPS","Instance ID","Block device","AvailabilityZone","encrypted","State"]

    }
}

output {
   elasticsearch {
      hosts => "http://xxxxxx:9200"
      index => "ebs-data"
      user => "xxxxxx"
      password => "xxxxxx"
   }
   stdout {}

}

input {
    file {
       path => "/home/xxxxxx/MAXCPU.csv"
       start_position => "beginning"
       sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        separator => ","
        columns => ["InstanceName","InstanceID","MAXCPU%"]

    }
}

output {
   elasticsearch {
      hosts => "http://xxxxxx:9200"
      index => "max-cpu-data"
      user => "xxxxxx"
      password => "xxxxxx"
   }
   stdout {}

}

[root@xxxxxx logstash]#

*****************
When I view the above data in Kibana Discover, Instance ID gets overlapped with InstanceName, and the MAXCPU% column shows Volume type information.

Can somebody please check and help me get this issue resolved?

You need to split that into two pipelines. Events from MAXCPU.csv will go through both filter sections and be sent to both output sections. Similarly for EBSInventory.csv.
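
For example (pipeline ids and file names here are placeholders), pipelines.yml could look roughly like this, with the EBSInventory input/filter/output in one file and the MAXCPU ones in the other:

- pipeline.id: ebs-inventory
  path.config: "/etc/logstash/conf.d/ebs-inventory.conf"

- pipeline.id: max-cpu
  path.config: "/etc/logstash/conf.d/max-cpu.conf"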

I was too slow, but since I've already written it down, I might as well post it :slight_smile: ...
I hope I understood this correctly, but I think you've got only one configuration file and are wondering why you are overwriting your own data. A Logstash conf file is not a linear program: it goes through all the inputs first, then all the filters, and then all the outputs. You'll have to split this up into two configuration files and configure two pipelines in pipelines.yml. Right now Logstash is mixing both configurations, and that causes the chaos (two CSV filters parsing the same line).
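
(If you'd rather keep everything in one file, a common alternative is to tag each input and wrap the filters and outputs in conditionals. A rough sketch, reusing the redacted paths from your post — untested, so treat it as a starting point:

input {
    file {
       path => "/home/xxxxxx/EBSInventory.csv"
       start_position => "beginning"
       sincedb_path => "/dev/null"
       tags => ["ebs"]    # mark events from this file
    }
    file {
       path => "/home/xxxxxx/MAXCPU.csv"
       start_position => "beginning"
       sincedb_path => "/dev/null"
       tags => ["cpu"]
    }
}

filter {
    # only the matching csv filter runs for each event
    if "ebs" in [tags] {
        csv {
            separator => ","
            columns => ["Volume Name","Volume ID","Volume type","Size","IOPS","Instance ID","Block device","AvailabilityZone","encrypted","State"]
        }
    } else if "cpu" in [tags] {
        csv {
            separator => ","
            columns => ["InstanceName","InstanceID","MAXCPU%"]
        }
    }
}

output {
    # route each tag to its own index
    if "ebs" in [tags] {
        elasticsearch {
            hosts => "http://xxxxxx:9200"
            index => "ebs-data"
        }
    } else if "cpu" in [tags] {
        elasticsearch {
            hosts => "http://xxxxxx:9200"
            index => "max-cpu-data"
        }
    }
}

That way each line is parsed only by the CSV filter that matches its source file.)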

Hi,

Thanks for your reply. 

I have split both my conf files and moved them into the conf.d directory:

[root@xxxxx ~]# cat /etc/logstash/conf.d/aws-cost.conf
input {
    file {
       path => "/home/xxxxx/EBSInventory.csv"
       start_position => "beginning"
       sincedb_path => "/dev/null"
    }
}
filter {
    csv {
        separator => ","
        columns => ["Volume Name","Volume ID","Volume type","Size","IOPS","Instance ID","Block device","AvailabilityZone","encrypted","State"]

    }
}

output {
   elasticsearch {
      hosts => "http://xxxx:9200"
      index => "inventoryo"
      user => "elastic"
      password => "xxxxx"
   }
   stdout {}

}

[root@xxxxx ~]#

************* 
[root@xxxxxx ~]# cat /etc/logstash/conf.d/maxcpu-usage.conf
input {
    file {
       path => "/home/xxxxxx/MAXCPU.csv"
       start_position => "beginning"
       sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        separator => ","
        columns => ["Sl.No","InstanceName","InstanceID","MAXCPU%","MINCPU%"]

    }
}

output {
   elasticsearch {
      hosts => "http://xxxx:9200"
      index => "inventoryo"
      user => "elastic"
      password => "xxxxx"
   }
   stdout {}

}
[root@xxxxxx ~]#

************ PFB pipelines.yml ************
[root@xxxxx ~]# cat /etc/logstash/pipelines.yml
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"

- pipeline.id: ebs-pipeline
  pipeline.workers: 1
  pipeline.output.workers: 1
  path.config: "/etc/logstash/conf.d/aws-cost.conf"

- pipeline.id: cpu-pipeline
  pipeline.workers: 1
  pipeline.output.workers: 1
  path.config: "/etc/logstash/conf.d/maxcpu-usage.conf"

After doing the above configuration I restarted the Logstash service and it came up successfully. But I am unable to view my index template in Kibana. (Also, I wanted to use the same index for both my CSV files.)

Also, one more thing I wanted to ask: if I keep my conf files in the /etc/logstash folder, what command do I have to use to start my Logstash pipeline?

Hi Team, please help me resolve this issue.

Thanks in advance.
