Logstash keeps on running and never exits

Hi,

I am reading data from a CSV test file and dumping it into Elasticsearch. The data is parsed properly and the index gets created, but the Logstash process never ends; it has to be killed manually. Please help, as I have to implement the same thing in my project.

Below is the Logstash config file:

input {
  file {
    path => "/alt/deepak/logstash-6.2.4/s3bucket/test.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "emp_id", "ename", "mgr_id", "salary", "deptno" ]
  }
  mutate { convert => ["emp_id", "integer"] }
  mutate { convert => ["mgr_id", "integer"] }
  mutate { convert => ["deptno", "integer"] }
  # Drop the CSV header row
  if [message] =~ /^emp_id/ {
    drop {}
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    #document_id => "%{org_id}%{connection_id}"
    document_type => "emp_data"
    index => "emp_data_index_%{+YYYY_MM_dd}"
  }
  stdout {}
}

Thanks,
Deepak

A file input is designed for tailing log files. So when Logstash starts, even if you tell it to read the entire file using

start_position => "beginning"
sincedb_path => "/dev/null"

it will then wait forever to see whether more lines are appended to the file, just like "tail -f". This is working as expected.
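
If you need Logstash to exit once the whole file has been processed, one common workaround is to feed the CSV to a stdin input instead of a file input, because Logstash shuts down when the stdin input reaches end-of-file. Here is a minimal sketch; the file name emp_csv.conf is just an assumed name for your pipeline config, and the filter and output sections stay exactly as in your existing config:

input {
  # Read events from standard input; Logstash exits after stdin hits EOF
  stdin {}
}
# ... keep your existing filter { } and output { } blocks unchanged ...

Then run it by piping the CSV into Logstash:

cat /alt/deepak/logstash-6.2.4/s3bucket/test.csv | bin/logstash -f emp_csv.conf

Newer versions of the file input plugin also offer a read mode meant for processing whole files rather than tailing them, but whether that option is available depends on the plugin version bundled with your Logstash release, so check the file input documentation for your version.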
