Pipeline to ingest JSON files with objects from an old cluster

Hi,

I'm still very new to ELK and need help. We had an old cluster with 2 Logstash, 3 Elasticsearch, and 2 Kibana nodes, running version 6.2.3. We built a new cluster and installed version 7.15.

We want to configure the new cluster with the objects and pipelines from the old one. How can we accomplish this?

Below are the steps we have followed so far:

  • We copied the old pipelines (the .config files) into Kibana --> Management --> Pipelines and added the pipeline IDs to logstash.yml (see the logstash.yml sketch after this list).

  • We didn't back up the data; we want to start storing new data.

  • We exported ALL the saved objects (index patterns, dashboards, etc.) from the old cluster and imported them into the new one, because we want to keep the same index patterns, dashboards, etc. (an import call is sketched after this list).

  • We created a test pipeline to test our input and filter; its output wrote the parsed JSON to a folder, and it worked (a sketch of that test pipeline is also below).
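
For the pipeline step, this is roughly what the centralized pipeline management section of our logstash.yml looks like (host names, credentials, and the pipeline ID are placeholders for our real values):

xpack.management.enabled: true
# must match the pipeline IDs created under Kibana --> Management
xpack.management.pipeline.id: ["myindex-pipeline"]
xpack.management.elasticsearch.hosts: ["http://dbes-01:9200", "http://dbes-02:9200", "http://dbes-03:9200"]
xpack.management.elasticsearch.username: elastic
xpack.management.elasticsearch.password: password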
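
For the saved objects, we used the export from the old Kibana's Management --> Saved Objects page. On the 7.15 side there is also an import API; a call like this should work, assuming the export is an .ndjson file (host, credentials, and file name are placeholders):

curl -u elastic:password -X POST "http://new-kibana:5601/api/saved_objects/_import?overwrite=true" -H "kbn-xsrf: true" --form file=@export.ndjson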
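
And this is roughly what the test pipeline looked like; the paths are placeholders and the real filter block is omitted:

input {
  file {
    path => "/tmp/incoming/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"    # re-read the files on every run while testing
    codec => "json"
  }
}
filter {
  # our real filter goes here
}
output {
  file {
    path => "/tmp/parsed/output.json"
  }
}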

This is where we need help.

After the test we switched to the old pipeline, whose output is shown below, but we don't see any data in Kibana and the Logstash logs don't tell us anything. We didn't change anything in the pipeline; we even want to use the same index names.

output {
  elasticsearch {
    hosts    => ["dbes-01", "dbes-02", "dbes-03"]
    index    => "myindex"
    user     => "elastic"
    password => "password"
  }
}
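
To see whether events even reach the output stage, we could temporarily add a stdout output next to the elasticsearch one, e.g.:

output {
  elasticsearch {
    hosts    => ["dbes-01", "dbes-02", "dbes-03"]
    index    => "myindex"
    user     => "elastic"
    password => "password"
  }
  # temporary, for debugging: print every event to the Logstash console
  stdout { codec => rubydebug }
}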


Do we need to create the index "myindex" in Elasticsearch beforehand, or does the pipeline take care of creating it just by naming it? Do I need to do something with the index pattern?
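
In case it matters, we can check whether the index exists with something like this (host and credentials are placeholders):

curl -u elastic:password "http://dbes-01:9200/_cat/indices/myindex?v"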

Thanks in advance
