Setup questions for shipping custom application logs into Logstash

I am very new to the ELK stack and have successfully set one up. Currently, I am learning how to parse custom application logs in a Logstash pipeline, send them to Elasticsearch, and view the data in Kibana.
Hence I have a few questions about whether my process/way of implementing this is correct.

Just for reference, one of my sample log data is:

2019-05-28 04:19:01.355 DEBUG - [SPLOC, 00082] [common_template.c,App_endTransaction,822] [Time to Commit Trans=1.223000]
2019-05-28 04:19:01.355 INFO - [SPLOC, 00082] [common_msg_hdlr.c,CommonMsgHdlr,153] [END_TRANS, Status: STATUS_OK]
2019-05-28 04:19:01.355 INFO - [SPLOC, 00082] [common_msg_hdlr.c,CommonMsgHdlr,212] End transaction. [Time Taken=1.764000].

Please advise/correct me if I am wrong.

  1. Create a pipeline on Logstash
  2. Use Filebeat to ship this file from a host into Logstash on the same port as the pipeline in (1)
  3. In /usr/share/logstash/config/conf.d/test.conf, configure a grok filter to parse the data
  4. Output to an Elasticsearch host with an index
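For step 2, my understanding is that the input side would look something like this (5044 is the conventional Beats port, so I am assuming that here):

```
input {
  beats {
    port => 5044
  }
}
```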

However, I am stuck at step 3. I am unable to start my pipeline (testpipe). Starting the Logstash service always creates only the default main pipeline.
How can I start my customized pipeline?

What I did:
Edited pipelines.yml:

- pipeline.id: testpipe
  path.config: "/usr/share/logstash/config/conf.d/test.conf"

Edited test.conf (this configuration is wrong, but I am unable to even start this pipe):

input {
  stdin {}
  #beats {
  #  port => 5044
  #}
}

# The filter part of this file is commented out to indicate that it is
# optional.
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
    remove_field => "message"
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    locale => en
    remove_field => ["timestamp"]
  }
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://esnode1:9200", "http://esnode2:9200", "http://esnode3:9200"]
    index => "testpipe-%{+YYYY.MM.dd}"
  }
}
Then I ran this:

../bin/logstash -f conf.d/test.conf --config.test_and_exit
../bin/logstash -f conf.d/test.conf --config.reload.automatic

Rather than deleting your question, it would be better if you could share your solution, as it may help others in the future with a similar problem :slight_smile:

The solution to starting my own pipe was to place the pipeline configuration file at /etc/logstash/conf.d/<here>

My mistake was that I placed it in the /usr/share/logstash/ directory.
I found this solution after carefully reading the documentation here:

On deb and rpm, you place the pipeline configuration files in the /etc/logstash/conf.d directory. Logstash tries to load only files with .conf extension in the /etc/logstash/conf.d directory and ignores all other files.
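So on a package install, after moving the file into that directory, the pipelines.yml entry would point there, roughly like this (paths adjusted to my setup, so double-check yours):

```
- pipeline.id: testpipe
  path.config: "/etc/logstash/conf.d/test.conf"
```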


Awesome, thank you very much for sharing that :smiley:
