How to run conf files on two different machines/nodes?

Hi All

I have two conf files, confile1.conf and confile2.conf.
confile1.conf is sitting on Machine 1.
confile2.conf is sitting on Machine 2.

Now I want to operate everything from Machine 1.
From Machine 1, I am able to run confile1.conf as a service.
How can I run confile2.conf from Machine 1 itself?

I checked the docs and we have a node feature, but how can the nodes communicate with each other?
--node.name NAME

I am not sure how to use this. My end goal is to run confile2.conf from Machine 1.

I want to do this because the volume of records is huge and it consumes a lot of processing power, so I need one machine to run my conf files for various nodes.

Logstash does not cluster, so you need to move the config file to the machine where you want to run it.
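One way to follow that advice without logging in to Machine 2 by hand is to push the file and start Logstash over SSH from Machine 1. This is only a sketch: the host name machine2 and the paths below are placeholders you would adjust to your installation.

```shell
# Copy the pipeline config from Machine 1 to Machine 2
# (assumes SSH access and a standard package install of Logstash)
scp /etc/logstash/conf.d/confile2.conf machine2:/etc/logstash/conf.d/

# Start Logstash remotely; the pipeline still executes on Machine 2,
# so it uses Machine 2's CPU and memory, not Machine 1's
ssh machine2 '/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/confile2.conf'
```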

Hi @Christian_Dahlqvist, thanks for your response.

I see that in the yml file we have
--node.name NAME
--http.host HTTP_HOST
--http.port HTTP_PORT

I will keep the config file confile2.conf on Machine 1 and run the command as usual, but in my yml file I am trying to give values like this:

--node.name machine2_node
--http.host machine2_host
--http.port 7001

So basically I am running the conf file on Machine 1, and it would actually utilize the resources of Machine 2. Is this not possible?
Looking at the yml file, I feel this should be possible.

Currently I am stuck at not being able to bind the remote host, i.e. the IP address of Machine 2, in the yml file. Any idea on this?

This is more like running a conf file from another machine; I just want to bind the host name, and I think then it will work.

Thanks

Logstash does not work in a clustered mode, so that is not possible. The closest thing is probably centralized pipeline management, where pipelines are managed through Kibana and Elasticsearch.
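For reference, centralized pipeline management is switched on per Logstash instance in logstash.yml, and each instance then pulls its pipelines from Elasticsearch instead of local conf files. This is a sketch: the Elasticsearch host, credentials, and pipeline ID below are placeholders, and note this is an X-Pack (licensed) feature.

```yaml
# logstash.yml on each managed Logstash instance (sketch, values are placeholders)
xpack.management.enabled: true
xpack.management.elasticsearch.hosts: ["http://es-host:9200"]
xpack.management.elasticsearch.username: logstash_admin_user
xpack.management.elasticsearch.password: changeme
# IDs of the pipelines this instance should fetch and run
xpack.management.pipeline.id: ["pipeline_machine2"]
```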

Do we have any other solution to get there?

Why can you not copy the files onto the appropriate hosts?

Like this, I need to run a lot of conf files on different machines, and I also need to change the filters by date every week. So I want to manage everything centrally.

Maybe I will have to use some server-side programming language to generate the conf files on the central machine and push them to the remote machines.

I was thinking it would work like ES.

If you have a large number of config files, you could manage and deploy them using an orchestration tool, e.g. Ansible.
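A minimal sketch of that approach with Ansible: keep per-host config directories on the central machine, copy each host's files out, and restart Logstash when they change. The host group, directory layout, and service name here are assumptions, not a definitive setup.

```yaml
# Sketch of an Ansible playbook deploying pipeline configs (paths are assumptions)
- hosts: logstash_nodes
  become: true
  tasks:
    - name: Copy this host's pipeline configs
      copy:
        src: "configs/{{ inventory_hostname }}/"
        dest: /etc/logstash/conf.d/
        owner: logstash
        group: logstash
      notify: restart logstash

  handlers:
    - name: restart logstash
      service:
        name: logstash
        state: restarted
```

Run weekly (or from a scheduler) after regenerating the date-based filters, and only the changed hosts get restarted.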

It does not, but another option might be to have a look at ingest node pipelines and do the processing in Elasticsearch. Whether this is possible or not depends a lot on what input and output plugins you use.
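To illustrate that idea: an ingest pipeline is defined once in Elasticsearch and referenced by name on indexing requests, so the filter logic lives centrally instead of in per-machine conf files. The host, pipeline name, and grok pattern below are made-up examples.

```shell
# Create a hypothetical ingest pipeline in Elasticsearch (adjust host/port)
curl -X PUT "http://es-host:9200/_ingest/pipeline/my_parse_pipeline" \
  -H 'Content-Type: application/json' -d '
{
  "description": "Parse a simple log line centrally instead of in Logstash filters",
  "processors": [
    { "grok": { "field": "message",
                "patterns": ["%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}"] } }
  ]
}'

# Index a document through the pipeline
curl -X POST "http://es-host:9200/logs/_doc?pipeline=my_parse_pipeline" \
  -H 'Content-Type: application/json' \
  -d '{"message":"2019-01-01T00:00:00 INFO started"}'
```

Whether this replaces your conf files depends on the point made above: ingest pipelines cover the filter stage, but you still need something (Beats, a minimal Logstash, etc.) for inputs and outputs.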