Kibana V.7.0 RC 2 - Couldn't find any index patterns with geospatial fields or / No Compatible Fields

Hi Devs and folks,

I have successfully installed V.7 RC 2 :=). I have softflowd activated on my pfSense. Now I receive the results in Kibana and I can see all the fields. So far it looks good to me.
It's probably a basic understanding problem :frowning:

Visualize -> Coordinate Map:
I receive this error:

Couldn't find any index patterns with geospatial fields or /

In the new Kibana Maps, from the left menu, I receive this on the right side:

No Compatible Fields: The netflow* index pattern does not contain any of the following field types: geo_point

JSON in Kibana:

{
  "_index": "netflow-2019.04.07",
  "_type": "_doc",
  "_id": "JF0h-WkBEw0MslqC5mx1",
  "_version": 1,
  "_score": null,
  "_source": {
    "geoip": {
      "longitude": XX6.9783,
      "country_name": "South Korea",
      "ip": "XXX.XXX.223.150",
      "city_name": "Seoul",
      "latitude": XX.5985,
      "timezone": "Asia/Seoul",
      "region_name": "Seoul",
      "as_org": "Korea Telecom",
      "autonomous_system": "Korea Telecom (4766)",
      "continent_code": "AS",
      "country_code3": "KR",
      "region_code": "11",
      "location": {
        "lat": XX.5985,
        "lon": XX6.9783
      },
      "asn": 4766,
      "country_code2": "KR"
    },
    "host": "192.168.0.X",
    "geoip_dst": {
      "longitude": 8.2723,
      "country_name": "Switzerland",
      "ip": "XXX.XXX.XXX.87",
      "city_name": "XXXXXX",
      "latitude": XX.0957,
      "timezone": "Europe/Zurich",
      "region_name": "XXXX",
      "postal_code": "XXXX",
      "as_org": "XXXXXX",
      "autonomous_system": "XXXXXX",
      "continent_code": "EU",
      "country_code3": "CH",
      "region_code": "LU",
      "location": {
        "lat": XX.0957,
        "lon": XX.2723
      },
      "asn": 8821,
      "country_code2": "CH"
    },
    "@timestamp": "2019-04-07T18:49:01.000Z",
    "geoip_src": {
      "longitude": XX.9783,
      "country_name": "South Korea",
      "ip": "XXX.212.223.150",
      "city_name": "Seoul",
      "latitude": 37.5985,
      "timezone": "Asia/Seoul",
      "region_name": "Seoul",
      "as_org": "Korea Telecom",
      "autonomous_system": "Korea Telecom (4766)",
      "continent_code": "AS",
      "country_code3": "KR",
      "region_code": "11",
      "location": {
        "lat": XX.5985,
        "lon": XXX.9783
      },
      "asn": XX66,
      "country_code2": "KR"
    },
    "netflow": {
      "protocol": 6,
      "tcp_flag_tags": [
        "SYN",
        "RST",
        "PSH",
        "ACK"
      ],
      "src_locality": "public",
      "flow_seq_num": 2512,
      "flowset_id": 1024,
      "flow_locality": "public",
      "dst_port": 8889,
      "src_port_name": "TCP/51188",
      "output_snmp": 1,
      "first_switched": "2019-04-07T18:46:50.264Z",
      "tcp_flags": 30,
      "protocol_name": "TCP",
      "bytes": 1472,
      "dst_addr": "XXX.XXX.XXX.87",
      "ip_version": "IPv4",
      "tcp_flags_label": "SYN-RST-PSH-ACK",
      "src_port": 51188,
      "last_switched": "2019-04-07T18:46:52.903Z",
      "src_addr": "XXX.XXX.XXX.150",
      "dst_port_name": "TCP/8889 (ddi-tcp-2)",
      "dst_locality": "public",
      "ip_protocol_version": 4,
      "version": "Netflow v9",
      "input_snmp": 1,
      "tos": 0,
      "packets": 8
    },
    "@version": "1",
    "type": "netflow",
    "tags": [
      "__netflow_direction_not_recognized"
    ]
  },
  "fields": {
    "netflow.first_switched": [
      "2019-04-07T18:46:50.264Z"
    ],
    "@timestamp": [
      "2019-04-07T18:49:01.000Z"
    ],
    "netflow.last_switched": [
      "2019-04-07T18:46:52.903Z"
    ]
  },
  "sort": [
    1554662810264
  ]
}

What did I forget?

Maybe it's because the netflow module can't be loaded correctly.

[root@SVGWMA-ELK-TEST-01 ~]# sudo /usr/share/logstash/bin/logstash --modules netflow --setup --path.settings /etc/logstash
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2019-04-08T19:39:10,459][INFO ][logstash.config.source.modules] Both command-line and logstash.yml modules configurations detected. Using command-line module configuration to override logstash.yml module configuration.
[2019-04-08T19:39:10,488][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-04-08T19:39:10,507][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.0.0"}
[2019-04-08T19:39:11,741][INFO ][logstash.config.source.modules] Both command-line and logstash.yml modules configurations detected. Using command-line module configuration to override logstash.yml module configuration.
[2019-04-08T19:39:11,948][INFO ][logstash.config.modulescommon] Setting up the netflow module
[2019-04-08T19:39:12,796][ERROR][logstash.modules.kibanaclient] Error when executing Kibana client request {:error=>#<Manticore::SocketException: Verbindungsaufbau abgelehnt (Connection refused)>}
[2019-04-08T19:39:12,997][ERROR][logstash.modules.kibanaclient] Error when executing Kibana client request {:error=>#<Manticore::SocketException: Verbindungsaufbau abgelehnt (Connection refused)>}
[2019-04-08T19:39:13,217][ERROR][logstash.config.sourceloader] Could not fetch all the sources {:exception=>LogStash::ConfigLoadingError, :message=>"Failed to import module configurations to Elasticsearch and/or Kibana. Module: netflow has Elasticsearch hosts: [\"localhost:9200\"] and Kibana hosts: [\"localhost:5601\"]", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config/modules_common.rb:108:in `block in pipeline_configs'", "org/jruby/RubyArray.java:1792:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/config/modules_common.rb:54:in `pipeline_configs'", "/usr/share/logstash/logstash-core/lib/logstash/config/source/modules.rb:14:in `pipeline_configs'", "/usr/share/logstash/logstash-core/lib/logstash/config/source_loader.rb:61:in `block in fetch'", "org/jruby/RubyArray.java:2572:in `collect'", "/usr/share/logstash/logstash-core/lib/logstash/config/source_loader.rb:60:in `fetch'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:148:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:96:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2019-04-08T19:39:13,229][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>RuntimeError, :message=>"Could not fetch the configuration, message: Failed to import module configurations to Elasticsearch and/or Kibana. Module: netflow has Elasticsearch hosts: [\"localhost:9200\"] and Kibana hosts: [\"localhost:5601\"]", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/agent.rb:155:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:96:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2019-04-08T19:39:13,772][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Can you check the mapping of netflow-2019.04.07?

GET /netflow-2019.04.07/_mapping

At least one of the fields should be mapped as geo_point: https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html

To have every newly created netflow-* index automatically get the right mapping for the location field, you'll want some kind of index template that correctly maps every new netflow-* index.
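The netflow module normally ships its own template, but just to illustrate what the mapping needs to contain, a minimal hand-rolled template could look like this (field names taken from your document above; the template name is my own choice):

PUT _template/netflow
{
  "index_patterns": ["netflow-*"],
  "mappings": {
    "properties": {
      "geoip":     { "properties": { "location": { "type": "geo_point" } } },
      "geoip_src": { "properties": { "location": { "type": "geo_point" } } },
      "geoip_dst": { "properties": { "location": { "type": "geo_point" } } }
    }
  }
}

Keep in mind that a template is only applied when an index is created, so existing netflow-* indices would have to be reindexed (or deleted and refilled) to pick up the geo_point mapping.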

The Logstash netflow module not loading correctly, as you point out, could be the reason why. I am not familiar with that module, but maybe its job is to load the necessary index template into Elasticsearch.
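Also, the "Connection refused" errors in your Logstash output suggest Kibana simply wasn't reachable on localhost:5601 when the module setup ran. Assuming a default local install, a quick sanity check before retrying would be:

curl http://localhost:9200
curl http://localhost:5601/api/status

Both should answer before the --setup run is repeated.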

Hi Tim,

Thanks for your reply.

It looks to me like this is the problem. I have always had problems with index templates. On "your" website there is often a lack of information about that. Maybe someone should write a blog post about it, because this is not just a netflow module problem, and it has been a problem for as long as I have known and used ELK ;), so at least since V.5.0. I also never found proper instructions on how to create templates and what the problems are when they are not created, like now. Can you help me, or send this thread to someone at Elastic who has the knowledge? Or is it just that basic and I'm too stupid?

{
  "netflow-2019.04.07" : {
    "mappings" : {
      "properties" : {
        "@timestamp" : {
          "type" : "date"
        },
        "@version" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        },
        "geoip" : {
          "properties" : {
            "as_org" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "asn" : {
              "type" : "long"
            },
            "autonomous_system" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "city_name" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "continent_code" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "country_code2" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "country_code3" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "country_name" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "dma_code" : {
              "type" : "long"
            },
            "ip" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "latitude" : {
              "type" : "float"
            },
            "location" : {
              "properties" : {
                "lat" : {
                  "type" : "float"
                },
                "lon" : {
                  "type" : "float"
                }
              }
            },
            "longitude" : {
              "type" : "float"
            },
            "postal_code" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "region_code" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "region_name" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "timezone" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            }
          }
        },
        "geoip_dst" : {
          "properties" : {
            "as_org" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "asn" : {
              "type" : "long"
            },
            "autonomous_system" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "city_name" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            "continent_code" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },

..... and it continues like this.

Pastebin: https://pastebin.com/FphfZCTG

No, you're not stupid!

We do have a blog post about geoip data that talks about a few ways of ingesting it. If you look at the section titled Logstash, it describes pretty much what your softflowd (sorry, I'm not familiar with it) is doing.

The "Mapping, for Maps" section talks about what I talked about with you... but a little in-passing: "... you will need to make sure you have an existing template in place".

Edit: forgot the link to the blog post: https://www.elastic.co/blog/geoip-in-the-elastic-stack
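If you want to check whether the module's index template ever made it into Elasticsearch, you can list matching templates (I'm guessing at the name here):

GET _template/netflow*

If that comes back empty, the float mapping you posted above is exactly what you'd expect to see.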

Hi Tim,

One problem was probably that my Elasticsearch server was running in development mode. I also had that problem with Filebeat and getting the sample dashboards in. After a few hours I found the problem, and now it's fixed.

But I still can't get the module running correctly.

I tried different commands and also added elasticsearch.hosts. Still no luck getting it running.
But I receive different errors when I switch the hosts "ip".

sudo /usr/share/logstash/bin/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055 -M netflow.var.elasticsearch.hosts="127.0.0.1:9200" -M netflow.var.kibana.host="127.0.0.1:9200"

sudo /usr/share/logstash/bin/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055 -M netflow.var.elasticsearch.hosts="0.0.0.0:9200" -M netflow.var.kibana.host="0.0.0.0:9200"

sudo /usr/share/logstash/bin/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055 -M netflow.var.elasticsearch.hosts="localhost:9200" -M netflow.var.kibana.host="localhost:9200"

For example, this one:

> * [root@SVGWMA-ELK-TEST-01 bin]# sudo /usr/share/logstash/bin/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055 -M netflow.var.elasticsearch.hosts="127.0.0.1:9200" -M netflow.var.kibana.host="127.0.0.1:9200"

> * WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
> * Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
> * [WARN ] 2019-04-11 06:39:09.256 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
> * [INFO ] 2019-04-11 06:39:09.275 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"7.0.0"}
> * [INFO ] 2019-04-11 06:39:10.546 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] modulescommon - Setting up the netflow module
> * [ERROR] 2019-04-11 06:39:11.490 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] kibanaclient - Error when executing Kibana client request {:error=>#<Manticore::UnknownException: Unrecognized SSL message, plaintext connection?>}
> * [ERROR] 2019-04-11 06:39:11.731 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] kibanaclient - Error when executing Kibana client request {:error=>#<Manticore::UnknownException: Unrecognized SSL message, plaintext connection?>}
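Edit: comparing this with my first run further up, two things stand out to me (untested, so take it as a guess): netflow.var.kibana.host points at port 9200, which is Elasticsearch's port, while the module earlier expected Kibana on ["localhost:5601"]; and I dropped --path.settings, hence the logstash.yml warning. The corrected invocation would presumably be:

sudo /usr/share/logstash/bin/logstash --modules netflow --setup --path.settings /etc/logstash -M netflow.var.input.udp.port=2055 -M netflow.var.elasticsearch.hosts="localhost:9200" -M netflow.var.kibana.host="localhost:5601"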
