How to convert all fields (or some fields) to integer in Logstash

Hello, I am a foreign developer. Please bear with me if my translation is awkward.

My pipeline is Filebeat >> Logstash >> Elasticsearch >> Kibana

Logstash reads and processes a CSV file. The flow first creates the CSV file with a header row, and then data rows are appended line by line.

The header of the CSV file is "no, item, cherry, apple", and all subsequent values are numeric. For example:

no, item, cherry, apple
1, 2, 3, 4

Every field is read as text. I want all field values to be converted to integers when they are read. "no" and "item" are fixed, but "cherry" and "apple" vary, so I can't hard-code the CSV columns. What method should I use? Should I use the ruby filter? I don't know Ruby yet.

This is my script now. Thank you for letting me know what I need to add or edit.

input {
  beats {
    host => "localhost"
    port => "5044"
  }
}

filter {

  csv {
    #separator => ","
    skip_header => true
    autogenerate_column_names => true
    autodetect_column_names => true
  }

  mutate {
    #remove_field => [
    #convert => ["[%{field}][long]", "integer"]
  }

  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }

  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    locale => "ko"
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "blank"
    manage_template => false
  }
}
Does the convert option in the csv filter work for you?

  csv {
    convert => {
      "column1" => "integer"
      "column2" => "boolean"
    }
  }
In my case I have an index template, applied when I write to ES, that sets the datatype based on the field name.


I already knew that. These fields are not fixed; they are dynamic. I want to know how to convert all dynamic fields to integers.
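Since the column names vary, the ruby filter is one way to do this. Below is a minimal sketch of just the conversion logic in plain Ruby; a hash stands in for the Logstash event, and `convert_numeric_fields` is an illustrative name, not a Logstash API:

```ruby
# Sketch of the per-event logic a Logstash ruby filter could run.
# A plain hash stands in for the event; in a real filter you would
# iterate event.to_hash and call event.set instead.
def convert_numeric_fields(fields)
  fields.each_with_object({}) do |(name, value), out|
    if value.is_a?(String) && value.strip.match?(/\A-?\d+\z/)
      out[name] = Integer(value.strip)  # numeric string -> integer
    else
      out[name] = value                 # leave non-numeric values alone
    end
  end
end

row = { "no" => "1", "item" => "2", "cherry" => "3", "apple" => "4" }
p convert_numeric_fields(row)
# => {"no"=>1, "item"=>2, "cherry"=>3, "apple"=>4}
```

Inside the pipeline, roughly the same logic would go in a ruby filter's `code` string: iterate `event.to_hash` and call `event.set(name, Integer(value))` for values matching the numeric pattern. The field names above are only the example CSV's, so any real columns would be handled the same way regardless of their names.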

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.