Logstash not indexing in Solr

I'm trying to implement the following architecture:

Apache Log --> Filebeat --> Logstash --> Solr

Apache and Filebeat run in the same VM.
Logstash and Solr run each one on its own VM.

I can tell from the logs that Filebeat is sending data to Logstash.
I also know that Logstash communicates with Solr.
But Solr does not index any documents, that is, Num Docs is always 0.
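One way to confirm the symptom from the command line is to query the collection's `numFound`. The host and collection name below are placeholders for your setup, and the live `curl` call is shown only in a comment; the runnable part parses a canned response of the same shape to show where the document count lives:

```shell
# Against a live Solr you would run something like (placeholder host/collection):
#   curl -s "http://SOLR_HOST:8983/solr/logs/select?q=*:*&rows=0&wt=json"
# The sample below extracts numFound from a response of that shape;
# a count stuck at 0 is exactly the "Num Docs are always 0" symptom.
response='{"responseHeader":{"status":0},"response":{"numFound":0,"start":0,"docs":[]}}'
echo "$response" | grep -o '"numFound":[0-9]*'
# -> "numFound":0
```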

The Logstash config file is as follows:

```
input {
   beats {
     type => beats
     port => 5044
   }
}
filter {
   grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
   }
   mutate {
      remove_field => [ "offset","ident","input_type","source","message","type","tags","@version","beat","host" ]
   }
}
output {
   stdout { codec => rubydebug }
   file {
       path => "/var/log/logstash/samuel.log"
   }
   solr_http {
      solr_url => ""
   }
}
```

Is there anywhere else I should look?

Have you had a look at your Solr logs, and at whatever may sit between Logstash and Solr? What issues are you seeing?

I found the following exception in /opt/lucidworks/fusion/var/log/solr/solr.log.

It seems the problem is with Fusion not indexing the data.

```
2017-01-30T17:46:22,745 - WARN [qtp1766822961-21:LukeRequestHandler@607] - {node_name=n:} - Error getting file length for [segments_c]
java.nio.file.NoSuchFileException: /opt/lucidworks/fusion/data/solr/tomcat1_shard1_replica1/data/index/segments_c
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86) ~[?:1.8.0_102]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_102]
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107) ~[?:1.8.0_102]
	at sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55) ~[?:1.8.0_102]
	at sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:144) ~[?:1.8.0_102]
	at sun.nio.fs.LinuxFileSystemProvider.readAttributes(LinuxFileSystemProvider.java:99) ~[?:1.8.0_102]
	at java.nio.file.Files.readAttributes(Files.java:1737) ~[?:1.8.0_102]
	at java.nio.file.Files.size(Files.java:2332) ~[?:1.8.0_102]
```

So it's not so much a Logstash issue. I would suggest taking a look at your Solr implementation.

Good day,

I'm facing the same problem trying to get Logstash to output to Solr. Would you mind sharing the steps and/or schema.xml you used to create the Solr Collection that logstash-output-solr_http is sending to?

Thanks much,

Hi Davis,
Were you able to resolve the Solr issue? I am trying to create the same pipeline with Logstash but don't see any documents in Solr. Can you please share the steps I need to follow to make it work?

Yes, it feels very lonely trying to use the SILK logging stack instead of the much more popular ELK stack. Here's what I needed to do for Logstash to Solr on CentOS 7:

```shell
# Assuming you've already installed Logstash on CentOS or similar
sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-solr_http

# Fix for Solr 6 & Logstash 5: timestamp error
# (https://github.com/logstash-plugins/logstash-output-solr_http/issues/3)
sudo sed -i.bak -e 's/document["@timestamp"]/#document["@timestamp"]/g' /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-solr_http-3.0.1/lib/logstash/outputs/solr_http.rb

# Make the plugin actually commit, via commitWithin on each add.
# Note: I could instead have added a soft commit below the add:
#   @solr.commit :commit_attributes => {:softCommit => true}
sudo sed -i -e 's/@solr.add(documents)/@solr.add(documents, :add_attributes => {:commitWithin=>1000})/g' /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-solr_http-3.0.1/lib/logstash/outputs/solr_http.rb
```

Basically, the version of this plugin on GitHub never commits, so no log events are ever searchable.
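If you want to sanity-check that `sed` substitution before touching the installed gem, you can dry-run it on a scratch file containing just the line it rewrites (the target line and replacement are from the commands above; the scratch path is arbitrary):

```shell
# Dry-run the commitWithin patch on a scratch copy of the target line,
# rather than the installed solr_http.rb.
printf '@solr.add(documents)\n' > /tmp/solr_http_snippet.rb
sed -i -e 's/@solr.add(documents)/@solr.add(documents, :add_attributes => {:commitWithin=>1000})/g' /tmp/solr_http_snippet.rb
cat /tmp/solr_http_snippet.rb
# -> @solr.add(documents, :add_attributes => {:commitWithin=>1000})
```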

Hope this helps,


Thanks for the reply. Do I still need to run the commands you mentioned even though I don't see any errors in Logstash? One more thing I'd like to mention is that I am using a third-party Docker image of Logstash. Do I need to add any additional configuration files like schema.xml, as I don't see any such files?

Hiya Rai,

I'm using the RPM Logstash on CentOS 7, so I can easily change this plugin. It would be trickier with Logstash in Docker.

Skip the first half of my diff about the ISO 8601 timestamp issue if you're not seeing that problem. The second part is the Solr commit fix that makes log events appear in SILK.

As for a Solr schema.xml, I'm not using one explicitly; instead I rely on dynamic fields for the fields coming from Logstash. My Logstash pipeline has a mutate to rename @timestamp to timestamp_dt:

```
filter {
  mutate {
    rename => { "@timestamp" => "timestamp_dt" }
  }
}
output {
  #stdout { codec => rubydebug }
  solr_http {
    solr_url => "http://SOLR_HOST:8983/solr/logs"
  }
}
```


To create the dynamic Solr collection named 'logs' I run:

```shell
sudo -u solr /opt/solr/bin/solr create -c logs -d /opt/solr/server/solr/configsets/data_driven_schema_configs/
```
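The reason the `timestamp_dt` rename works with no explicit schema.xml is that this configset defines dynamic fields by suffix, and `*_dt` maps to a date type. From memory (the exact type name varies by Solr version, so check the managed-schema in your own configset), the relevant line looks roughly like:

```
<dynamicField name="*_dt" type="date" indexed="true" stored="true"/>
```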

