Indexing in Kibana


#1

Hi,
I am new to ELK and have just installed the ELK stack v5.5.1.
The ELK stack seems to be working fine and I am able to access the Kibana interface; however, I am having a problem trying to create an index in Kibana.

Currently my Logstash conf file is:
input {
  file {
    path => "/input/*.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Event Time", "Device", "deviceEventClassId", "deviceCustomString2", "sourceUserName", "filePath", "fileSize"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logstashfeed1"
  }
  stdout { codec => rubydebug }
}

I haven't created an index in Kibana yet; however, from what I understand, I can name an index in the Logstash conf and Logstash will create it in Elasticsearch, so that Kibana recognises it. Logstash also doesn't seem to recognise the .csv files which I am trying to load.

I'd appreciate any help and any pointers, especially if I am missing something blatantly obvious.
Thanks
Matt


(Prachi Mishra) #2

Check this URL to see whether your indices have been created: http://localhost:9200/_cat/indices


#3

yellow open .kibana xFw6zNUSR3S9oMbu0u5zHQ 1 1 2 0 5.3kb 5.3kb

This is the output. Is it possible to create an index from the Logstash conf so that it is created when Logstash is started?
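If it helps, here is a quick way to check for the index from a script (a sketch: the sample line below is the pasted output above rather than a live call, and `logstashfeed1` is the index name from the conf; in practice you would capture `curl -s http://localhost:9200/_cat/indices` instead):

```shell
# Check whether the "logstashfeed1" index appears in the _cat/indices
# listing. The variable holds sample text copied from the post above;
# replace it with: indices=$(curl -s http://localhost:9200/_cat/indices)
indices='yellow open .kibana xFw6zNUSR3S9oMbu0u5zHQ 1 1 2 0 5.3kb 5.3kb'
if echo "$indices" | grep -q 'logstashfeed1'; then
  echo "index exists"
else
  echo "index missing"
fi
```

With only the .kibana line present, this prints "index missing", which matches what the output above shows: Logstash has not written any documents yet.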


(Prachi Mishra) #4

Can you remove this line from your output and try?
If your configuration is correct you will find similar output at http://localhost:9200/_cat/indices:

yellow open logstash-2017.07.25 U2iZ-sY5SHiH7Kdm51kj7Q 5 1 18978 0 7.6mb 7.6mb


#5

Unfortunately this hasn't worked either, my output is:
yellow open .kibana xFw6zNUSR3S9oMbu0u5zHQ 1 1 3 0 13kb 13kb

Not entirely sure how the size of the .kibana index has increased, but I'm still having issues creating the index and loading the .csv files. My configuration looks normal compared to other examples I have seen, which leaves me confused as to why it isn't recognising the files or creating an index.


(Prachi Mishra) #6

Debug Logstash and see if any errors come up while parsing the .csv files.
The command is:
.\logstash -f C:\your-path\logstash\config\logstash.conf --debug


#7

Not quite sure if you'll be able to read the output, however this is what it is outputting. I have also tried putting just one of the CSVs in the bin folder where the config is stored to see if that is the issue, but it doesn't seem apparent either way.


(Prachi Mishra) #8

I don't see any error in the logs, though. I have a few observations; see below:

  1. Can you confirm one thing: have you set the logstash.conf path in logstash.yml, like this?
    path.config: C:\ELK-stack\logstash\logstash\config\logstash.conf
    config.reload.automatic: true

  2. You can try one thing: give the absolute path of your .csv files.

  3. Share a sample csv file for better understanding.
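For point 2, the file input might look like this (a sketch; the path and the sincedb_path line are assumptions. On Windows, Logstash's file input expects forward slashes in the path, and pointing sincedb_path at the null device forces a full re-read of the files on each test run, which is handy while debugging):

```
input {
  file {
    # Hypothetical absolute path -- adjust to wherever your CSVs actually live
    path => "C:/ELK-stack/input/*.csv"
    start_position => "beginning"
    # Re-read the files on every run while testing (Windows null device)
    sincedb_path => "NUL"
  }
}
```

Without a sincedb override, the file input remembers how far it has read each file, so restarting Logstash will not re-ingest files it has already seen, even with start_position => "beginning".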


#9

Just altered the logstash.yml file and it is now throwing this error:


Throughout I have also now included the absolute path. That leaves me to guess that there's an issue with the logstash.yml file, however I'm not entirely sure where.


#10

Researched the error and found I had an extra space in the .yml file! However, after restarting all the components of ELK it still hasn't worked... :slightly_frowning_face:


(Prachi Mishra) #11

I faced a similar issue with an index, but after correcting logstash.conf the index got created.
I did it for logs, not a csv file. I guess you need to change something in the filter { } block.
If it's right then your csv files will get parsed, and you can check that in the logs as well.


#12

All has worked!!! Thank you for the help!

The problem was that the .csv file I was trying to get ELK to ingest wasn't fully separated with commas (ELK didn't like some of the spaces and tabs that were included).
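For anyone hitting the same thing, a quick way to spot unevenly delimited rows is to count comma-separated fields per line (a sketch; the two sample rows below are made up, with a tab standing in for the bad delimiter described above):

```shell
# Count fields per line when splitting on commas; any row whose count
# differs from the header's is a row the csv filter cannot parse cleanly.
# The second sample row uses a tab instead of a comma, so it counts short.
printf 'Event Time,Device,fileSize\n10:00\tserver1,42\n' | awk -F',' '{ print NF }'
# prints 3, then 2
```

Running this over the real file and looking for counts that differ from the header row would have flagged the bad lines before Logstash silently choked on them.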
Thanks though!


(system) #13

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.