Clarification - ingesting documents using Logstash into an existing index

Hello folks,

I want to create the index structure and then ingest documents into it using Logstash.
In the pipeline config file in the attachment, where should I specify the name of the index into which this data should be ingested? In the output section?

I created the index structure as follows:

PUT /animals
{
  "settings": {
    "number_of_shards": 2,
    "number_of_replicas": 2
  },
  "mappings": {}
}

PUT /animals/_mapping/doc
{
  "properties": {
    "animal-type": {
      "type": "text"
    },
    "name": {
      "type": "text",
      "analyzer": "standard"
    },
    "legs": {
      "type": "integer"
    },
    "tail": {
      "type": "boolean"
    },
    "eating-type": {
      "type": "text"
    }
  }
}

You've created an "animals" index with predefined mappings for various features of animal species, but your Logstash configuration reads a CSV file with car data. This doesn't make any sense.

Sorry, I uploaded the wrong Logstash file.

Okay. Well, yes, the elasticsearch output controls which ES host to connect to and into which index to put the data. What each event looks like is determined by the inputs and filters you have.
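
For example, a minimal pipeline could look like the sketch below. The file path, the CSV column names (taken from your mapping above), and the Elasticsearch host are assumptions, so adjust them to your setup:

input {
  file {
    path => "/path/to/animals.csv"      # hypothetical path to your CSV file
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    # Column names assumed to match the mapping you created
    columns => ["animal-type", "name", "legs", "tail", "eating-type"]
  }
  mutate {
    convert => {
      "legs" => "integer"
      "tail" => "boolean"
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # assumed ES host
    index => "animals"                   # the existing index to write into
    document_type => "doc"               # only needed on pre-7.x clusters that use mapping types
  }
}

The index option in the elasticsearch output is where the index name goes; the csv filter and mutate convert just shape each event so its fields line up with the types in your mapping.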

Thank you!
