Importing JSON readings from file to Elasticsearch

Hello, I have a script that logs temperature + humidity from different sensors and stores each sensor's data in its own directory; every day a new log is created in the format YYYY-MM-DD.log.

${data_root}/A/0/*.log
${data_root}/A/1/*.log
...

The logs are in JSON format; here is an example line:
{"timestamp": "2018-03-02T00:00:29.451416", "temperature": "24.30", "humidity": "25.30"}

I'm having trouble understanding how to correctly configure my Logstash instance. I figured my input should look something like this:

input {
  file { path => "/var/wlogs/a1/*.log" codec => json_lines type => "a1" }
  file { path => "/var/wlogs/a2/*.log" codec => json_lines type => "a2" }
  etc..
}

I'm having trouble with the export side; I can't even figure out where to start. I want to send every reading to an index that corresponds to its type (a1, a2, etc..).
Please help me :slight_smile:
Thanks in advance

As I understand it, the requirement is to read logs from folders distinguished by type as the folder name (i.e. a1, a2, a3, ...), and you want that type as the index name. For this you can try:

input {
  file { path => "/var/wlogs/a1/*.log" codec => json_lines type => "a1" }
  file { path => "/var/wlogs/a2/*.log" codec => json_lines type => "a2" }
  etc..
}

filter {
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "%{type}"
  }
}
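
One possible addition, based on the example line above: temperature and humidity are quoted strings in the JSON, so Elasticsearch will map them as text unless you convert them. The sketch below (assuming the field names are exactly "timestamp", "temperature" and "humidity" as in the sample) parses the timestamp into @timestamp and converts the readings to floats; it could go in the otherwise empty filter block:

filter {
  # use the log line's own timestamp as the event @timestamp
  date {
    match => ["timestamp", "ISO8601"]
  }
  # convert the quoted numeric strings to floats so they are mapped as numbers
  mutate {
    convert => { "temperature" => "float" "humidity" => "float" }
  }
}

Two smaller notes: as far as I know the file input already emits one event per line, so the plain json codec is usually preferred over json_lines there; and once Logstash is running you can verify that the per-type indices were created with something like curl 'localhost:9200/_cat/indices?v'.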
