File input plugin not sending to elasticsearch

Hi,

I am trying to index AWS CloudTrail logs in elastic search using the file input plugin for logstash.

The file it is reading contains one JSON event per line (newline-delimited JSON). From my understanding, for this reason I should be using the json_lines codec.
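For illustration, a newline-delimited JSON file of that shape looks like the following (field names are hypothetical, not actual CloudTrail output):

```json
{"eventName": "ConsoleLogin", "eventTime": "2017-01-01T00:00:00Z"}
{"eventName": "DescribeInstances", "eventTime": "2017-01-01T00:05:00Z"}
```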

Here is input/output configuration:

input {
  file {
    path => "/home/centos/cloudtrailproject/cloudtraillog.json"
    codec => "json_lines"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "cloudtrail"
    user => omitted
    password => omitted
  }
}

I can't really tell whether the input plugin is running, but I assume it indexes whenever new data is appended to the file. I am not seeing any errors in the Logstash logs, and no data appears in the index regardless of the time filter. I even tried creating the index pattern in Kibana without a time filter and it still showed no data, so my assumption is that the data is not making it into Elasticsearch. (The index is created in Elasticsearch, though, and in Kibana you can see all of the fields that would be indexed if the data were displaying.) I also made sure the Logstash service is running and has read permission on the file.

Can anyone point me in the right direction here, or does anyone have an idea of what might be going wrong?

Thanks.

Divide and conquer. Use a trivial output plugin like file or stdout until you've established that the input data is being read correctly. Only then should you enable the elasticsearch output and work on getting that part of the pipeline working.
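A minimal debug pipeline along those lines might look like this (the file path is copied from the config above; start_position and sincedb_path are additions I'd suggest for testing, since the file input tails for new lines by default and otherwise remembers its position between runs):

```
input {
  file {
    path => "/home/centos/cloudtrailproject/cloudtraillog.json"
    codec => "json_lines"
    # read the file from the top instead of tailing it
    start_position => "beginning"
    # don't persist the read position between test runs
    sincedb_path => "/dev/null"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
```

If events print here but not with the elasticsearch output, the input side is fine and the problem is downstream.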

Hi Magnus, thanks for your reply, and sorry for the delayed response.

I used a trivial output like stdout and can successfully see the JSON lines being read from the file. However, when I switch back to the elasticsearch output posted above, I still don't see any results. (I do see the lines being read in the debug output, but as far as I can tell nothing is actually being put into Elasticsearch.)

This is surely down to my lack of experience with the ELK stack; do you have any tips for why this might be happening? Does the index that the output writes to have to already exist? Is it possible it is successfully putting the data into Elasticsearch, but there is an issue with my Discover search? Thanks for any ideas you may have.

This is surely down to my lack of experience with the ELK stack; do you have any tips for why this might be happening?

Not sure what's going on here, but I'd be surprised if the elasticsearch output doesn't log anything at all if you increase the log level.
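For reference, a sketch of how to raise the log level when running Logstash from the command line (the pipeline file path here is assumed):

```
bin/logstash -f /etc/logstash/conf.d/cloudtrail.conf --log.level=debug
```

The same can be set persistently with `log.level: debug` in logstash.yml.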

Does the index that the output writes to have to already exist?

No, indexes will be created implicitly when documents are added to them.
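For example, indexing a document into a nonexistent index creates it on the fly. The index name and document below are made up for illustration, and the `_doc` endpoint assumes a recent Elasticsearch version:

```
curl -XPOST 'localhost:9200/test-implicit/_doc' \
  -H 'Content-Type: application/json' \
  -d '{"hello": "world"}'

# the index now exists
curl 'localhost:9200/_cat/indices/test-implicit?v'
```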

Is it possible it is successfully putting the data into Elasticsearch, but there is an issue with my Discover search?

Oh, sure. There are several Elasticsearch REST APIs that you can use to see what's going on in ES to avoid getting fooled by Kibana, e.g. the cat indices endpoint.
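Assuming Elasticsearch on localhost:9200 as in the config above, something like:

```
# does the index exist, and how many documents does it hold?
curl 'localhost:9200/_cat/indices?v'

# fetch a sample document directly, bypassing Kibana entirely
curl 'localhost:9200/cloudtrail/_search?size=1&pretty'
```

If the search returns hits, the data is in Elasticsearch and the problem is on the Kibana side (index pattern or time filter); if it returns none, the problem is in the Logstash output.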

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.