Map layer shows "No results found"

Hello everyone,
I am having trouble viewing points in my Kibana map. It says "No results found". I have made sure the data is in geo_point format, so I have no clue what the problem could be. Below is an image of the problem and the data in Discover.

{
  "docvalue_fields": [],
  "size": 10000,
  "track_total_hits": 10001,
  "_source": false,
  "fields": [
    "location"
  ],
  "script_fields": {},
  "stored_fields": [
    "location"
  ],
  "runtime_mappings": {},
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "bool": {
            "must": [
              {
                "exists": {
                  "field": "location"
                }
              },
              {
                "geo_bounding_box": {
                  "location": {
                    "top_left": [
                      8.4375,
                      37.71859
                    ],
                    "bottom_right": [
                      11.953129999999987,
                      36.03133
                    ]
                  }
                }
              }
            ]
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  }
}

What version? How was it installed?
I ask as there is/was a bug depending on those answers.

Further down that settings screen, try the 2nd clustering setting.

I used version 8.0.
And the 2nd clustering setting you talk about, is that Tooltip fields?

You did not answer my question about your installation method. Also, I was referring to the Scaling method: select something other than "Use vector tiles".

I performed the installation of the whole stack on Windows.

Did you try a different scaling method?

Limit results to 10,000

Or

Show clusters when results exceed 10,000

Perhaps take a look at

I have changed the scaling method, but there is no visualization; it is the same issue.

Then we pretty much need to back up to the beginning.

Show us the mapping (schema).
Show us a couple of sample documents.
Also, how you added the layer.

Also, you could simply add the sample data (e.g. the web logs sample set) and check whether it displays.

8.0.1 with kibana_sample_data_logs added as a documents layer... the layer shows fine... so the problem is most likely in your mapping (schema) and/or data...

@5k_pwc

It looks like the problem is with your layer "visibility" settings. In the screen shot provided in your post, the map is at zoom level 2.34 but your layer "visibility" is set to only show the layer between zoom ranges 10 and 24. Change layer visibility to be between 0 and 24 to show data at all zoom levels.


Wow @Nathan_Reese, you have eagle eyes. I looked right at that and did not see it!

I tried to change the layer visibility to be between 0 and 24; whatever modification I make, the problem stays the same.


A preview of the geometry fields in Postgres:

Hi @5k_pwc, the Postgres field does not help us; that is not how the data is stored in Elasticsearch.

So I think we are back to these questions:

How did you load the data?
Show us the mapping (schema), which looks right: you have a geo_point.
Show us a couple of sample documents from Elasticsearch:

GET /es-*/_search

At the beginning I created my empty index, then I loaded the data with Logstash 8.0.
Then the mapping looks like this:
(screenshot: Capture d’écran 2022-08-13 234033)
In my logstash config I named the geo field location, but in the mapping I can't see it; I can only see it in the data view, like this:


And the coordinates like this:

My logstash config file contains:

mutate {
    rename => { "LAT" => "[location][lat]" }
    rename => { "LON" => "[location][lon]" }
}

I hope I was able to answer your questions.

Thanks, but that still does not show us what data is actually stored in Elasticsearch, and we cannot tell from that screenshot. (BTW, screenshots of text are highly discouraged, as they cannot be searched, debugged, etc.)

The map will only work on a geo_point (or other geo_* types); the coordinates field is not read by the map.

AND OH, is there a time field associated with this data? It looks like you do not have one, but just asking...

Go to Kibana -> Dev Tools and run this. Do you see any results? Do you see any results with the location field populated?

This is important to see what your location data looks like!

And actually, you should be able to go to Kibana -> Dev Tools, run the query you saw in Inspect on the Maps page, and see what it returns. Example below...

Just copy the query in right below the GET /es-*/_search line; this is from your examples...

GET /es-*/_search
{
  "docvalue_fields": [],
  "size": 10000,
  "track_total_hits": 10001,
  "_source": false,
  "fields": [
    "location"
  ],
  "script_fields": {},
  "stored_fields": [
    "location"
  ],
  "runtime_mappings": {},
  "query": {
    "bool": {
      "must": [],
      "filter": [
        {
          "bool": {
            "must": [
              {
                "exists": {
                  "field": "location"
                }
              },
              {
                "geo_bounding_box": {
                  "location": {
                    "top_left": [
                      8.4375,
                      37.71859
                    ],
                    "bottom_right": [
                      11.953129999999987,
                      36.03133
                    ]
                  }
                }
              }
            ]
          }
        }
      ],
      "should": [],
      "must_not": []
    }
  }
}
{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 719,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "jeudi",
        "_id" : "2046",
        "_score" : 1.0,
        "_source" : {
          "ygps" : 0,
          "adressenum" : "67747",
          "@timestamp" : "2022-08-13T21:36:02.557252300Z",
          "numext" : null,
          "id" : "2046",
          "nom" : null,
          "detail" : "CASSECONDPROV_AVGRILLAGEAVERT",
          "datecons" : "2015/12/01 00:00:00.000",
          "commune" : "1117",
          "doccons" : null,
          "secteur_e" : "30515033",
          "commcons" : null,
          "zgps" : null,
          "volume_per" : 0,
          "secteur_ad" : "117151",
          "gouvern" : "11",
          "agent" : null,
          "nombre_ab" : null,
          "directio_1" : "60100000",
          "arrond_ar" : "111712",
          "adresse" : null,
          "etat" : "REPAREE",
          "detectio" : "APPEL",
          "arrond" : "111712",
          "n_invent" : null,
          "cause" : "INTERVHUM",
          "typedefa" : "PERCE",
          "commune_ar" : "1117",
          "type" : "Point",
          "adressecod" : "1117",
          "docrepa" : null,
          "xgps" : 0,
          "coordinates" : [
            1144140.75,
            4428199.28
          ],
          "techrepa" : "REPARE",
          "cadran" : null,
          "division" : null,
          "priorite" : "IMMEDIAT",
}}}

The problem is that I did not create the coordinates field; rather, I created location with the geo_point format, but it is empty, it does not contain any data.

Hi @5k_pwc Yup, BTW that is not the mapping, that is an actual document... :slight_smile:

To get the mapping / schema, just run GET /jeudi (without the _search).
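If the mapping is right, the response to GET /jeudi should contain a geo_point entry for location, something like this (a sketch trimmed to the relevant part; index and field names taken from this thread):

```json
{
  "jeudi": {
    "mappings": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```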

But that is perfect, and as you state... you have no actual location field / data... there lies the problem! :frowning:

So yes, you have a mapping for location (schema / field)... but you are not actually filling the location data into the location field. You have an error in your logstash configuration; it is not properly loading the location field.

Soooo you will need to provide your entire logstash conf and a couple of samples of the data if you want help, all in text. Anonymize what you need to.

tl;dr: you have a bug in your logstash pipeline configuration.
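A side note on that coordinates field: [1144140.75, 4428199.28] is far outside the valid longitude/latitude ranges, so those values look like projected meters (Web Mercator / EPSG:3857) rather than degrees. A quick sanity check (an illustration, assuming spherical Web Mercator with earth radius 6378137 m):

```python
import math

R = 6378137.0  # spherical Web Mercator earth radius in meters

def mercator_to_lonlat(x, y):
    """Convert EPSG:3857 (Web Mercator) meters to WGS84 degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2.0 * math.atan(math.exp(y / R)) - math.pi / 2.0)
    return lon, lat

# coordinates from the sample document above
lon, lat = mercator_to_lonlat(1144140.75, 4428199.28)
print(lon, lat)  # roughly 10.28, 36.92
```

That point lands inside the geo_bounding_box of the map query (lon 8.44 to 11.95, lat 36.03 to 37.72), so once the data is reprojected to degrees and actually written into location, the map should be able to find it.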

A simple way to debug logstash is to put the stdout debugger at the end, in the output section:

output {
  ...
  stdout { codec => rubydebug }
}

OK, this is my logstash conf:

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Lenovo\Desktop\ELK\logstash-8.1.0\logstash-core\lib\jars\postgresql-42.4.0.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost/****"
    jdbc_user => "****"
    jdbc_password => "*****"
    statement => 'SELECT id, st_asgeojson(geom) as geom, nom, typeano, etat, adresse, createur FROM "TABLENAME"'
  }
}

filter {
  mutate { rename => { "data" => "additionalData" } }

  # parse geom field as json,
  # geom => {"type":"Point","crs":{"type":"name","properties":{"name":"EPSG:3857"}},"coordinates":[10,11]}
  json { source => "geom" }

  # remove unwanted fields
  mutate { remove_field => ["crs", "path", "@version", "host", "message", "geom"] }

  # prepare geo_point field
  # mutate { rename => {"coordinates" => "[location][coordinates]"} }
  # mutate { rename => {"type" => "[location][type]"} }
  mutate {
    rename => { "LAT" => "[location][lat]" }
    rename => { "LON" => "[location][lon]" }
  }
  # mutate { replace => {"location" => "POINT(%{lon} %{lat})"} }
}


output {
  stdout {
    codec => json_lines
  }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "jeudi"
    user => "**********"
    password => "********"
    document_id => "%{id}"
  }
}

knowing that the geom field is the geometry field, and the others are character fields

I see you have been trying to debug :slight_smile:

To help, I need to see the raw input and output.

Please comment out / remove everything in the filter section, just read the input, and show me the output.

Sometimes the json_lines codec can cause issues; I don't recommend it.

Can you please use this, read just 1 or 2 inputs, and provide the output:

  stdout { codec => rubydebug }

example

input {
  jdbc {
    jdbc_driver_library => "C:\Users\Lenovo\Desktop\ELK\logstash-8.1.0\logstash-core\lib\jars\postgresql-42.4.0.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost/****"
    jdbc_user => "****"
    jdbc_password => "*****"
    statement => 'SELECT id, st_asgeojson(geom) as geom, nom, typeano, etat, adresse, createur FROM "TABLENAME"'
  }
}

output {
  stdout { codec => rubydebug }
}
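For what it's worth, once the raw output confirms what the data looks like, the fix will likely need two things: the rename of LAT / LON does nothing because those fields never exist in the events, and the coordinates coming out of the GeoJSON are EPSG:3857 meters, not degrees (per the comment in the filter above). A sketch, untested, assuming PostGIS is available (ST_Transform reprojects to WGS84) and relying on geo_point accepting a GeoJSON-style [lon, lat] array:

```conf
input {
  jdbc {
    # connection settings as above
    # st_transform reprojects EPSG:3857 -> EPSG:4326 so coordinates come back as [lon, lat] degrees
    statement => 'SELECT id, st_asgeojson(st_transform(geom, 4326)) as geom, nom, typeano, etat, adresse, createur FROM "TABLENAME"'
  }
}

filter {
  # geom becomes {"type":"Point","coordinates":[lon,lat], ...}
  json { source => "geom" }
  # a geo_point field accepts a [lon, lat] array, which matches the GeoJSON coordinates order
  mutate { rename => { "coordinates" => "location" } }
  mutate { remove_field => ["crs", "type", "geom"] }
}
```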