I'm trying to implement the following architecture:
Apache Log --> Filebeat --> Logstash --> Solr
Apache and Filebeat run in the same VM.
Logstash and Solr run each one on its own VM.
I can tell from the logs that Filebeat is sending data to Logstash.
I also know that Logstash communicates with Solr.
But Solr does not index any documents; that is, Num Docs is always 0.
I found the following exception in /opt/lucidworks/fusion/var/log/solr/solr.log.
It seems the problem is with Fusion not indexing the data.
2017-01-30T17:46:22,745 - WARN [qtp1766822961-21:LukeRequestHandler@607] - {node_name=n:172.27.1.108:8983_solr} - Error getting file length for [segments_c]
java.nio.file.NoSuchFileException: /opt/lucidworks/fusion/data/solr/tomcat1_shard1_replica1/data/index/segments_c
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86) ~[?:1.8.0_102]
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102) ~[?:1.8.0_102]
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107) ~[?:1.8.0_102]
at sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55) ~[?:1.8.0_102]
at sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:144) ~[?:1.8.0_102]
at sun.nio.fs.LinuxFileSystemProvider.readAttributes(LinuxFileSystemProvider.java:99) ~[?:1.8.0_102]
at java.nio.file.Files.readAttributes(Files.java:1737) ~[?:1.8.0_102]
at java.nio.file.Files.size(Files.java:2332) ~[?:1.8.0_102]
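Before digging further into Fusion, it can help to query Solr directly and confirm the document count and commit state. A quick check from the shell (the collection name `tomcat1` is my guess from the replica path in the log above; adjust host, port, and collection for your setup):

```
# Ask Solr how many documents the collection holds (rows=0 returns only numFound)
curl 'http://172.27.1.108:8983/solr/tomcat1/select?q=*:*&rows=0&wt=json'

# If documents were sent but never committed, force a commit and re-check
curl 'http://172.27.1.108:8983/solr/tomcat1/update?commit=true'
```

If numFound jumps above 0 after the explicit commit, the indexing side is fine and the problem is only that nothing is committing.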
I'm facing the same problem trying to get Logstash to output to Solr. Would you mind sharing the steps and/or schema.xml you used to create the Solr Collection that logstash-output-solr_http is sending to?
Hi Davis
Were you able to resolve the Solr issue? I am trying to create the same pipeline with Logstash but don't see any documents in Solr. Can you please share the steps I need to follow to make it work?
Thanks
Rai
Yes, it feels very lonely trying to use the SILK logging stack instead of the much more popular ELK stack. Here's what I needed to do for Logstash to Solr on CentOS 7:
Assuming you've already installed Logstash on CentOS or similar
sudo /usr/share/logstash/bin/logstash-plugin install logstash-output-solr_http
# Fix for Solr6 & Logstash 5: timestamp error
sudo sed -i.bak -e 's/document["@timestamp"]/#document["@timestamp"]/g' /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-solr_http-3.0.1/lib/logstash/outputs/solr_http.rb
Thanks for the reply. Do I still need to run the command you mentioned even if I don't see any errors in Logstash? One more thing I'd like to mention: I am using a third-party Docker image of Logstash. Do I need to add any additional configuration files, like schema.xml, as I don't see any such files?
I'm using the RPM Logstash on CentOS 7, so I can easily change this plugin. It would be trickier with Logstash in Docker.
Skip the first half of my diff about the iso8601 issues if you're not seeing that problem. The second part is the Solr commit that makes log events appear in SILK.
As for a Solr schema.xml, I'm not using one explicitly; instead I rely on dynamic fields from Logstash. My Logstash pipeline has a mutate filter that renames @timestamp to timestamp_dt:
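The config itself didn't paste in above, so here is a minimal sketch of what such a pipeline fragment looks like (the Solr URL and collection name are placeholders for your own setup; timestamp_dt assumes Solr's default `*_dt` dynamic field rule maps it to a date field):

```
filter {
  mutate {
    # Rename @timestamp so Solr's *_dt dynamic field rule indexes it as a date
    rename => { "@timestamp" => "timestamp_dt" }
  }
}

output {
  solr_http {
    # Replace host/port/collection with your own Solr endpoint
    solr_url => "http://solr-host:8983/solr/collection1"
  }
}
```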