Copy Metricbeat data from one Elasticsearch cluster to another on different machines

Hi,

source: Elasticsearch-8.1.0
destination: Elasticsearch-8.1.0

The source and destination are on different machines.
On the source, Metricbeat indices are created daily.

metricbeat.yml configuration:

output.elasticsearch.index: "metricbeat-%{+yyyy.MM.dd}"
setup.template.name: "metricbeat"
setup.template.pattern: "metricbeat-*"
setup.template.priority: 500
setup.ilm.enabled: false

I need to copy several days' worth of Metricbeat data from the source above to the destination.
Below is my Logstash configuration:

input{
	elasticsearch{
		hosts => "<source-ip>:<source-port>"
		user => "elastic"
		password => "4gnn=f0MifrUWJR8Vq_a"
		index => "metricbeat-2022.03.25"
		ssl => true
		ca_file => "C:/Users/avisriva/Downloads/http_ca.crt"
	}
}
output{
	elasticsearch{
		hosts => "https://<destination-ip>:<destination-port>"
		user => "elastic"
		password => "d2L=pxAjFUH*SdL=Qy9b"
		ssl_certificate_verification => false
		index => "metricbeat-2022.03.25"
	}
	stdout{
		codec => rubydebug
	}
}

No errors appear in the console, and I can see the events being printed by the stdout output, but still no data is copied to the destination!
Can anybody help?

Also, when I query https://:/_cat/indices?v on the destination,
I can see an index has been created with the name .ds-metricbeat-2022.03.25-2022.05.16-000001, but it contains 0 documents.
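Worth checking: the .ds- prefix on that index name suggests the destination is routing writes into a data stream rather than a plain index. Something like the following should confirm it (host, port, and CA path are placeholders matching the config above):

curl --cacert http_ca.crt -u elastic "https://<destination-ip>:<destination-port>/_data_stream/metricbeat-2022.03.25"

If this returns a data stream definition rather than a 404, the destination template is creating a data stream for the metricbeat-* pattern.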

Can anybody guide here?

Looks like I have to use

action => "create"

in the output elasticsearch section. The .ds- prefix on the destination index shows it is backed by a data stream, and data streams only accept create operations, so the plugin's default index action was being rejected. Now it's working.
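For anyone hitting the same issue, the working output section looks like this (host and password are placeholders; substitute your own values from the original config):

output{
	elasticsearch{
		hosts => "https://<destination-ip>:<destination-port>"
		user => "elastic"
		password => "<destination-password>"
		ssl_certificate_verification => false
		index => "metricbeat-2022.03.25"
		action => "create"	# required: data streams are append-only and accept only create operations
	}
}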

