How to run logstash as a service in CentOS

I am able to run Logstash successfully with the following command:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf

But it does not work when I run the following commands:

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf &
nohup /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf &

Thanks in advance

Hi @ramanna_hk,
which version of CentOS are you using?

version 7.5

Did you try with the systemctl commands?

start

systemctl start logstash

stop

systemctl stop logstash

enable on boot

systemctl enable logstash
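To verify that the service actually came up (a quick check, assuming a standard systemd setup):

```shell
systemctl status logstash          # shows active/failed state and recent log lines
journalctl -u logstash --no-pager  # full service log, useful when startup fails
```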

Yes @pge

and the pipelines.yml file includes:

    - pipeline.id: main
      path.config: "/usr/share/app.conf"
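Note that the path.config above ("/usr/share/app.conf") differs from the path used on the command line ("/etc/logstash/conf.d/app.conf"). A sketch of pipelines.yml that matches the command-line run would be:

```yaml
# Sketch: point the main pipeline at the same file used on the command line
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/app.conf"
```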

Logstash is running, but data is not being indexed.

I think this is a different issue.

  1. If Logstash is correctly configured, systemctl should be enough to run it. If you don't have an index, you need to check your configuration files in /etc/logstash/conf.d.

  2. Why do you need to run it from the command line like this?

    /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf &
    nohup /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf &

Thank you @pge

  1. There is no problem in the configuration file.
  2. I want to run Logstash as a service (in the background) on the production machine.

@ramanna_hk on CentOS 7, systemctl is the default command to manage services.

Remember also that running Logstash as a service involves several settings (for example /etc/systemd/system/logstash.service or /etc/default/logstash), so I use a command line like "/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf" just for testing; otherwise I use systemctl.

If Logstash doesn't work correctly with systemctl, I think you have a different problem.

You need to configure startup.options first, and then run logstash/bin/system-install, in order to add the files for systemd.

https://www.elastic.co/guide/en/logstash/current/config-setting-files.html
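The steps above can be sketched as follows (paths assume the default package install on CentOS; adjust as needed):

```shell
# 1. Edit the startup options (service user, JAVA_HOME, nice level, etc.)
sudo vi /etc/logstash/startup.options

# 2. Generate the systemd unit files from those options
sudo /usr/share/logstash/bin/system-install

# 3. Start the service and enable it on boot
sudo systemctl start logstash
sudo systemctl enable logstash
```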


I am using the default configuration in startup.options. When I run Logstash with systemctl start logstash, the logs are:
[2018-07-19T11:26:14,579][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"localhost:5044"}
[2018-07-19T11:26:14,876][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-07-19T11:26:14,983][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x544f63a5 sleep>"}
[2018-07-19T11:26:15,434][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-07-19T11:26:16,291][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

But data is not being indexed.

Please tell me what I should change in the configuration.

Thank you

Hi @ramanna_hk,
could you post your /etc/logstash/conf.d/* configuration?

Note: the same config works with this command: /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf

input {
  beats {
    port => 5044
    host => "localhost"
  }
}
filter {
  csv {
    separator => ","
    columns => [
      "Machine_id",
      "Time",
      "CRCErrors_Current_Value",
      "CRCErrors_Severity",
      "CRCErrors_Min_Current_Value",
      "CRCErrors_Max_Current_Value",
      "DownlinkJitter_Current_Value",
      "DownlinkJitter_Severity",
      "DownlinkJitter_Warning.Threshold",
      "DownlinkJitter_Critical.Threshold",
      "DownlinkJitter_Min_Current_Value",
      "DownlinkJitter_Max_Current_Value",
      "DownlinkRSSI_Current_Value",
      "DownlinkRSSI_Severity",
      "DownlinkRSSI_Warning.Threshold",
      "DownlinkRSSICritical.Threshold",
      "DownlinkRSSI_Min_Current_Value",
      "DownlinkRSSI_Max_Current_Value",
      "DownlinkUtilization_Current_Value",
      "DownlinkUtilization_Severity",
      "DownlinkUtilization_Min_Current_Value",
      "DownlinkUtilization_Max_Current_Value",
      "Frequency_Current_Value",
      "Frequency_Severity",
      "Frequency_Min_Current_Value",
      "Frequency_Max_Current_Value",
      "Latency_Current_Value",
      "Latency_Severity",
      "Latency_Warning.Threshold",
      "Latency_Critical.Threshold",
      "Latency_Min_Current_Value",
      "Latency_Max_Current_Value",
      "PacketDrop_Current_Value",
      "PacketDrop_Severity",
      "PacketDrop_Warning.Threshold",
      "PacketDrop_Critical.Threshold",
      "PacketDrop_Min_Current_Value",
      "PacketDrop_Max_Current_Value"
    ]
  }
  date {
    match => [ "Time", "YYYY/MM/dd-HH:mm:ss", "YYYY/MM/dd-HH:mm:ss.S-SS " ]
    target => "Time"
  }
  mutate { convert => ["CRCErrors_Current_Value", "float"] }
  mutate { convert => ["CRCErrors_Min_Current_Value", "float"] }
  mutate { convert => ["CRCErrors_Max_Current_Value", "float"] }
  mutate { convert => ["DownlinkJitter_Current_Value", "float"] }
  mutate { convert => ["DownlinkJitter_Warning.Threshold", "float"] }
  mutate { convert => ["DownlinkJitter_Critical.Threshold", "float"] }
  mutate { convert => ["DownlinkJitter_Min_Current_Value", "float"] }
  mutate { convert => ["DownlinkJitter_Max_Current_Value", "float"] }
  mutate { convert => ["DownlinkRSSI_Current_Value", "float"] }
  mutate { convert => ["DownlinkRSSI_Warning.Threshold", "float"] }
  mutate { convert => ["DownlinkRSSI_Min_Current_Value", "float"] }
  mutate { convert => ["DownlinkRSSI_Max_Current_Value", "float"] }
  mutate { convert => ["DownlinkUtilization_Current_Value", "float"] }
  mutate { convert => ["DownlinkUtilization_Min_Current_Value", "float"] }
  mutate { convert => ["DownlinkUtilization_Max_Current_Value", "float"] }
  mutate { convert => ["Frequency_Current_Value", "float"] }
  mutate { convert => ["Frequency_Min_Current_Value", "float"] }
  mutate { convert => ["Frequency_Max_Current_Value", "float"] }
  mutate { convert => ["Latency_Current_Value", "float"] }
  mutate { convert => ["Latency_Warning.Threshold", "float"] }
  mutate { convert => ["Latency_Critical.Threshold", "float"] }
  mutate { convert => ["Latency_Min_Current_Value", "float"] }
  mutate { convert => ["Latency_Max_Current_Value", "float"] }
}
output {
  if [type] == "cpe9101" {
    stdout { codec => rubydebug }
    elasticsearch {
      action => "index"
      hosts => ["localhost:9200"]
      index => "cpe9101-%{+YYYY.MM.dd}"
      document_type => "cpe9101"
    }
  }
  if [type] == "cpe9102" {
    stdout { codec => rubydebug }
    elasticsearch {
      action => "index"
      hosts => ["localhost:9200"]
      index => "cpe9102-%{+YYYY.MM.dd}"
      document_type => "cpe9101"
    }
  }
  if [type] == "cpe9103" {
    stdout { codec => rubydebug }
    elasticsearch {
      action => "index"
      hosts => ["localhost:9200"]
      index => "cpe9103-%{+YYYY.MM.dd}"
      document_type => "cpe9101"
    }
  }
}
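Since this config works when run by hand but not under systemd, one thing worth checking (a suggestion, not confirmed in this thread) is whether the config parses cleanly and whether the logstash service user can read the files involved:

```shell
# Validate the pipeline config without starting a pipeline
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/app.conf --config.test_and_exit

# Check that the service user (usually "logstash") can read the config directory
sudo -u logstash ls -l /etc/logstash/conf.d/
```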

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.