Hi,
Currently, I have Logstash configured to split events. The issue is that it sometimes works, and sometimes it goes 10-15 minutes without any output (it is supposed to produce output every 5 minutes).
On the backend, I have a scheduled cron job that calls an API and writes the response to a file in JSON format. Logstash then reads the file and splits the "result" field.
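For context, each line the cron job writes looks roughly like the following (the field names other than "result" are hypothetical; "result" holds the array I want split into separate events):

```json
{
  "timestamp": "2024-01-01T00:00:00Z",
  "result": [
    { "id": 1, "status": "ok" },
    { "id": 2, "status": "error" }
  ]
}
```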
Here's my Logstash code:
input {
  file {
    start_position => "end"
    path => ["/var/log/logstash/abc_api-*.json"]
    sincedb_path => "/dev/null"
  }
}
filter {
  json {
    source => "message"
  }
  split {
    field => "result"
  }
}
output {
  elasticsearch {
    ssl => true
    ssl_certificate_verification => false
    cacert => "/etc/logstash/elasticsearch-ca.pem"
    hosts => "https://10.0.0.2:9200"
    user => "${LS_USER}"
    password => "${LS_PWD}"
    manage_template => true
    index => "abc-logs-%{+YYYY.MM.dd}"
    pipeline => "abc-api"
  }
  #stdout { codec => rubydebug { metadata => true } }
}
I've searched the forum and I think this post helps, but I don't fully understand the "splitData.rb" code, so I haven't implemented it. It seems to remove the "message" field, but I could be wrong.
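My rough understanding of what such a script would do, written as a standalone Ruby sketch rather than an actual Logstash ruby filter (field names taken from my config; the function name and structure are my own assumptions, not the code from that post):

```ruby
require 'json'

# Hypothetical sketch: parse one raw JSON line, emit one record per
# element of the "result" array, and drop the original "message" and
# "result" fields from the copied top-level data.
def split_result(raw_line)
  doc = JSON.parse(raw_line)
  (doc["result"] || []).map do |item|
    base = doc.reject { |key, _| key == "result" || key == "message" }
    base.merge("result" => item)
  end
end
```

So an input line with two items in "result" would yield two records, each carrying the shared top-level fields plus one item under "result".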
Also, I have configured Logstash with 4 GB of heap memory. The array in the file currently has about 500 items (it will grow to roughly 5,000). I usually use Filebeat for all things Elastic since it's less resource-intensive, but Filebeat does not support splitting, so this is an exception.
