Logstash can't start the pipeline from the conf.d file

Hello everyone,
When I run the command /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf, here is what I get:

[INFO ] 2022-05-07 09:58:02.271 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[INFO ] 2022-05-07 09:58:02.665 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@localhost:9200/]}}
[WARN ] 2022-05-07 09:58:02.946 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://elastic:xxxxxx@localhost:9200/"}
[INFO ] 2022-05-07 09:58:02.962 [[main]-pipeline-manager] elasticsearch - Elasticsearch version determined (7.17.3) {:es_version=>7}
[WARN ] 2022-05-07 09:58:02.965 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[INFO ] 2022-05-07 09:58:03.024 [Ruby-0-Thread-10: :1] elasticsearch - Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[INFO ] 2022-05-07 09:58:03.025 [[main]-pipeline-manager] elasticsearch - Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[WARN ] 2022-05-07 09:58:03.038 [[main]-pipeline-manager] grok - Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[INFO ] 2022-05-07 09:58:03.083 [Ruby-0-Thread-10: :1] elasticsearch - Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[INFO ] 2022-05-07 09:58:03.293 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/simple.conf"], :thread=>"#<Thread:0x4aeb4e5f run>"}
[INFO ] 2022-05-07 09:58:04.665 [[main]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>1.37}
[INFO ] 2022-05-07 09:58:04.684 [[main]-pipeline-manager] beats - Starting input listener {:address=>"0.0.0.0:5044"}
[INFO ] 2022-05-07 09:58:04.730 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2022-05-07 09:58:04.831 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[ ]}
[INFO ] 2022-05-07 09:58:04.877 [[main]<beats] Server - Starting server on port: 5044

It gets stuck here and nothing happens.
Here is my simple.conf:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts    => ["localhost:9200"]
    user     => "elastic"
    password => "fPAbmCodQi6q390fLLU3"
  }
  stdout { codec => rubydebug }
}

And my logstash.yml:

 xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: Dv5cvj9uTdxMSD1jOgiv
'

Before enabling xpack, everything worked fine.


Hi @ahmed_barki Welcome to the community.

1st: What versions of the stack are you using? (Filebeat, Logstash, Elasticsearch)

2nd: The pipeline has started and is waiting on input from Beats until it receives some. Are you actually sending data through Beats?

3rd: What do you mean by "it worked before you turned on monitoring"?

4th: Are you trying to use data streams or normal indices?

5th: This has bad syntax:

 xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: Dv5cvj9uTdxMSD1jOgiv
' <-- What is this extra mark?

(EDITED) Waiting on answers to the above.

Hello @stephenb
1- I'm using Logstash 7.17.3, Elasticsearch 7.17.3, and Filebeat 7.17.3.
2- Filebeat should send data to Logstash (port 5044). I commented out the Elasticsearch output in filebeat.yml.
3- Before enabling xpack on Elasticsearch and setting up passwords for Beats, Logstash, Elasticsearch, and Kibana, the same config file worked with no problem:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}

I just added the Elasticsearch credentials afterwards.
4- Normal indices.
5- I added that extra mark by mistake while typing the post here on Discuss.
Thanks for your answer.
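
For point 2, a sketch of what the relevant filebeat.yml sections would look like with the Elasticsearch output commented out and the Logstash output enabled (the log path here is just an illustrative assumption):

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/apache2/access.log   # assumption: Apache access logs

# output.elasticsearch is commented out; ship to Logstash instead
output.logstash:
  hosts: ["localhost:5044"]
```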

Thanks for the details.

Ahh... What exactly did you set up / enable with xpack in Elasticsearch? Authentication, HTTPS, etc.? Did Kibana work after that as well? I am confused about why you mentioned monitoring (perhaps we can leave that for part II).

After you set up xpack (security), could you still connect with curl?

Can you share your elasticsearch.yml?

What did the output section of the logstash.conf look like after you added xpack?
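
A quick check along those lines might look like this (the password is just a placeholder; use the one generated for your elastic user):

```shell
# Verify the elastic credentials against the cluster root endpoint.
# A healthy, authenticated cluster returns a JSON blob with the version info.
curl -s -u elastic:changeme http://localhost:9200 \
  || echo "no response from localhost:9200"
```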

After adding xpack, I used this command:
bin/elasticsearch-setup-passwords auto

I can log in to Kibana using the elastic user (Kibana is working; I can visualize logs via Metricbeat).

I added this to the elasticsearch output in my conf file:

user     => "elastic"
password => "fPAbmCodQi6q390fLLU3"

Thank you for your help.


Apologies @ahmed_barki. Is it working now?

No, it's not; same problem. When I use /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf, it gets stuck on:

[INFO ] 2022-05-07 09:58:04.877 [[main]<beats] Server - Starting server on port: 5044

Logstash has started and is waiting for a stream.
Check whether your firewall is active on the Logstash host.
Also have a look at the Filebeat side and enable debug logging. Maybe the file has already been parsed.
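
A few checks along those lines (the commands assume a Linux host with iproute2 and ufw; adjust for your distro):

```shell
# 1) On the Logstash host: is the Beats listener actually up on 5044?
ss -tln 2>/dev/null | grep ':5044' || echo "nothing listening on 5044"

# 2) Check the firewall (ufw shown; use firewall-cmd or iptables elsewhere):
#    sudo ufw status

# 3) On the Filebeat side: run in the foreground with debug logging to see
#    whether any harvester starts on your files (Ctrl-C to stop):
#    filebeat -e -d "*"
```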


Yes, what @Rios said is very important.

If you want to send the same file again, you have to remove the data directory in Filebeat, because it keeps track of what you've already loaded and won't send the same file(s) again. To me, it looks like Logstash is up and running and waiting for more log lines from Filebeat.
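
For example, on a DEB/RPM Filebeat install the registry lives under /var/lib/filebeat (that path is the packaged default; if you run from a tarball, the data directory sits next to the binary instead):

```shell
# Reset Filebeat's registry so already-shipped files are read again.
REGISTRY="/var/lib/filebeat/registry"   # assumption: DEB/RPM default path

if [ -d "$REGISTRY" ]; then
  sudo systemctl stop filebeat
  sudo rm -rf "$REGISTRY"
  sudo systemctl start filebeat
else
  echo "no registry found at $REGISTRY - nothing to reset"
fi
```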


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.