Logstash not recognizing the geo_point columns created in the filter

Hi, I have fields like A_Latitude, A_Longitude, B_Latitude and B_Longitude. I would like to use this data to create maps in Kibana. The problem is that the data is getting into Elasticsearch, but the geo_point columns created in the Logstash filter are not being recognized, and no data is being fed into geo_point1 and geo_point2.

Hence, I first created a geo_point mapping in the Kibana Dev Tools as follows:
PUT cc-test
{
  "mappings": {
    "properties": {
      "geo_point1": {
        "type": "geo_point"
      },
      "geo_point2": {
        "type": "geo_point"
      }
    }
  }
}
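
To confirm the mapping took effect, it can be read back with the standard mapping API (cc-test is the index created above):

GET cc-test/_mapping

Both geo_point1 and geo_point2 show up with "type": "geo_point" in the response, so the mapping itself looks fine.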

I have configured my Logstash config file the following way:

input {
  jdbc {
    # JDBC connection string to our database (SAP HANA driver below)
    jdbc_connection_string => "some string"
    # The user we wish to execute our statement as
    jdbc_user => "User"
    jdbc_password => "Password"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/apps/ELK/logstash/driver/ngdbc-2.4.56.jar"
    jdbc_driver_class => "com.sap.db.jdbc.Driver"
    # our query
    #jdbc_validate_connection => true
    #schedule => "* * * * *"
    #record_last_run => true
    #last_run_metadata_path => "login.txt"
    statement => "SELECT
        inputdata.A_LATITUDE, inputdata.A_LONGITUDE, inputdata.B_LATITUDE,
        inputdata.B_LONGITUDE, outputdata.BANDWIDTH, inputdata.SEQUENCEID,
        inputdata.REQUESTTIMESTAMP
      FROM inputdata, outputdata
      WHERE inputdata.SEQUENCEID = outputdata.SEQUENCEID
        AND inputdata.REQUEST_TIMESTAMP >= '2019-01-01 00:00:00'
        AND inputdata.SEQUENCEID IS NOT NULL
        AND inputdata.SEQUENCEID NOT IN ('N/A')
      ORDER BY inputdata.SEQUENCEID DESC"
    #jdbc_paging_enabled => "true"
    #jdbc_page_size => "10000"
  }
}
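
One thing I am unsure about: according to the docs, the jdbc input lowercases column names by default (lowercase_column_names defaults to true), so the fields may actually reach the filter as a_latitude rather than A_LATITUDE. A minimal sketch of pinning the input to the uppercase names the filter expects:

input {
  jdbc {
    # ... connection settings as above ...
    # keep column names exactly as the database returns them,
    # so the uppercase field names used in the filter below match
    lowercase_column_names => false
  }
}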

filter {
  mutate {
    convert => { "A_LONGITUDE" => "float" }
    convert => { "A_LATITUDE" => "float" }
    convert => { "B_LONGITUDE" => "float" }
    convert => { "B_LATITUDE" => "float" }
  }
  mutate {
    rename => {
      "A_LONGITUDE" => "[geo_point1][lon]"
      "A_LATITUDE" => "[geo_point1][lat]"
    }
  }
  mutate {
    rename => {
      "B_LONGITUDE" => "[geo_point2][lon]"
      "B_LATITUDE" => "[geo_point2][lat]"
    }
  }
}
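
If the renames fire, the stdout { codec => rubydebug } output should show nested objects roughly like this (the coordinates here are made-up placeholders, not my real data):

{
    "geo_point1" => {
        "lat" => 48.1351,
        "lon" => 11.582
    },
    "geo_point2" => {
        "lat" => 52.52,
        "lon" => 13.405
    }
}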

output {
  elasticsearch {
    hosts => ["http://some server"]
    index => "cc-test"
    #document_type => "system_logs"
    user => "Username"
    password => "Password"
  }
  stdout { codec => rubydebug }
}
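
After a pipeline run, this is how I check what actually landed in the index (standard search API; if geo_point1 and geo_point2 are missing from the hit, the renames never matched):

GET cc-test/_search
{
  "size": 1,
  "_source": ["geo_point1", "geo_point2"]
}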

I don't understand what is wrong with the filter part and why the data is not getting into the geo_point1 and geo_point2 fields!
Can somebody please help? :pray::pray::pray:
