LS agent successfully started but ES index not created


I am new here. The first time I ran through the Logstash guide, I pushed all my data into Elasticsearch, but every field was indexed as a string, so I couldn't do much with it. So I deleted the index and tried again. This time Logstash says the agent started successfully, but I don't see any index at all. I think I did something wrong.

Here is my config file:

input {
  file {
    path => ["C:\elk_stack\data\Fr_apr_18.csv"]
    start_position => "beginning"
    type => "logs"
  }
}

filter {
  csv {
    columns => ["log_id","response_time_in_secs","response_date_time","string","ID","Version","data"]
    separator => ","
  }
  date {
    match => ["response_date_time", "yyyy/MM/dd HH:mm:ss"]
    target => "response_date_time"
  }
  mutate { convert => [ "log_id", "integer" ] }
  mutate { convert => [ "response_time_in_secs", "float" ] }
  mutate { convert => [ "ID", "integer" ] }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "logstash-test-1"
  }
  stdout { }
}

Have a look at this blog post which walks you through how to work with Logstash and get your data into Elasticsearch in the expected format.

Thanks for responding @Christian_Dahlqvist

I am using ELK 5.1.2 because my system is Windows 7 32-bit.

And I don't see what I did wrong; I already went through this tutorial.

What is the result if you remove the index name from your Elasticsearch output plugin and index into the default Logstash index?
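For reference, a minimal output block with the index setting removed might look like the sketch below (assuming Elasticsearch is reachable on localhost); Logstash then writes to its default `logstash-%{+YYYY.MM.dd}` index:

```conf
output {
  elasticsearch {
    # No "index" option here, so events go to the
    # default logstash-%{+YYYY.MM.dd} daily index.
    hosts => "localhost"
  }
}
```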


Now I am getting this error.
It doesn't matter whether I use an index name or not.

Did you enter your conf file path in logstash.yml?
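In Logstash 5.x that setting is `path.config` in logstash.yml; a sketch is below (the file name and path are just examples, adjust them to where your conf file actually lives). Alternatively, you can skip logstash.yml and pass the file directly with `-f` on the command line:

```yaml
# logstash.yml -- example only, use your real conf file location
path.config: "C:/elk_stack/conf/logstash.conf"
```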

Also put your Elasticsearch port number in the conf file:

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [""]
    index => "logstash-test-1"
  }
}

"stdout{ codec => rubydebug }" by this command you can see the parsed log when logstash started .

@rijinmp I think I missed that one.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.