How do I read a JSON file as input and index the data into Elasticsearch as output?

Hi,

I have a file containing JSON data. I have an index "company" with document type "employee" on my local Elasticsearch server. I am unable to index the data from the file and send it to the Elasticsearch server.

Config file name: logstash_test.conf
Content of the config file:
input {
  file {
    path => "/opt/test.log"
    type => "json"
    start_position => "beginning"
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "company"
    document_type => "employee"
    document_id => "%{@timestamp}"
  }
  stdout { codec => rubydebug }
}

My JSON data file content is:
{{"first_name":"r1","last_name":"j1","dob":"1922-05-09"},
{"first_name":"r2","last_name":"j2","dob":"1944-05-09"}}

Please help me out in resolving this issue.

Have you checked the Logstash logs for clues? To debug, comment out the elasticsearch output to rule out one error source. It's quite likely that Logstash is tailing the log file; in that case, deleting the sincedb file or setting sincedb_path to /dev/null will help. Read the file input documentation to learn more about sincedb.
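A minimal debugging configuration along those lines might look like this (a sketch, reusing the path from the original config; sincedb_path => "/dev/null" makes Logstash forget its read position, so the file is reread from the beginning on every run):

```
input {
  file {
    path => "/opt/test.log"
    start_position => "beginning"
    # Don't persist the read position, so the whole file is
    # reprocessed each time Logstash starts:
    sincedb_path => "/dev/null"
    codec => "json"
  }
}

output {
  # elasticsearch output removed while debugging; inspect
  # the parsed events on the console first:
  stdout { codec => rubydebug }
}
```

Once events appear correctly in the rubydebug output, add the elasticsearch output back in.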

