Auto-mapping a CSV file

I'm uploading a CSV file to Elasticsearch via Logstash, and before that I do the mapping manually. How can I automate the mapping when the CSV file is uploaded?

Not sure what you mean here. Dynamic mapping is enabled by default, so normally you don't have to specify mappings explicitly. Please explain what you want to do and why.
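To see dynamic mapping in action, index a document into an index that doesn't exist yet and then fetch the mapping ES generated (a sketch; test is a hypothetical index name, and the exact URL depends on your ES version):

PUT test/_doc/1
{
  "name": "some name",
  "age": 26
}

GET test/_mapping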

Correct. But with dynamic mapping I have to specify the data type of every field in Logstash; otherwise it is mapped as a text field. Am I correct?

No, ES will attempt to guess the mapping type based on the contents. Perhaps you just need to convert the fields to the most suitable type before you send them to ES?
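For example, whether age arrives as a JSON number or as a string (which is what CSV parsing produces) changes what ES guesses (a sketch with hypothetical index names; details vary by ES version):

// age is a JSON number, so it is mapped as long
PUT test1/_doc/1
{ "age": 26 }

// age is a string, so it is mapped as text (with a keyword sub-field)
PUT test2/_doc/1
{ "age": "26" }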

For example, I have a file:

+-----------+---------+
| Name      |  age    |
+-----------+---------+
| croos     |  26     |
| nilu      |  30     |
+-----------+---------+   

Would you please give me an example of the mapping for the above CSV file?

The field names and data types have to be mapped as name (text) and age (integer).

If you convert the age field to an integer field (e.g. using the mutate filter's convert option) then ES should map that field as an integer.
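If you instead want to define the mapping explicitly before indexing, it would look something like this (a sketch; user is a hypothetical index name, and older ES versions need an additional document-type level inside mappings):

PUT user
{
  "mappings": {
    "properties": {
      "name": { "type": "text" },
      "age":  { "type": "integer" }
    }
  }
}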


Yeah, that's what I'm using now.

input {
  file {
    # "/dev/null" ("NUL" on Windows) keeps Logstash from persisting
    # read positions, so the file is re-read on every run
    sincedb_path => "/dev/null"
    # note: the path should be absolute and use forward slashes
    path => "\user.csv"
    start_position => "beginning"
    type => "data"
  }
}
filter {
  csv {
    columns => ["name", "age"]
    separator => ","
  }
  mutate {
    # convert age so ES maps it as a numeric type instead of text
    convert => { "age" => "integer" }
  }
}
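
For completeness, a minimal output section would look along these lines (host and index name are placeholders):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "user"
  }
  stdout { codec => rubydebug }
}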

But my questions are:

  1. If I don't use the mutate filter, will ES attempt to guess the type of age as integer (since both values look like integers)?

  2. How do I indicate that the first row contains the field names, so that the mapping for name and age happens automatically?

  1. Maybe, I'm not sure. Should be easy to test, right?
  2. That's not possible out of the box. The file input doesn't necessarily read the file from the top and it has no knowledge of CSV columns.
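
That said, if your csv filter version supports it, there is an autodetect_column_names option that picks up the column names from the first line processed. It only works predictably when the header line really is processed first (a single pipeline worker, reading from the beginning); a sketch:

filter {
  csv {
    # take column names from the first line seen;
    # requires pipeline.workers: 1 so the header arrives first
    autodetect_column_names => true
    # drop the header row itself instead of indexing it
    skip_header => true
    separator => ","
  }
}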