Can I upload CSV IPs to Elasticsearch, and then use them in Grafana?


I have a CSV file with data like this:

AT,Austria;; 6
AT,Austria;; 1
AT,Austria;; 7004
BR,Brazil;; 1
BR,Brazil;; 1
BR,Brazil;; 1
CA,Canada;; 1
CA,Canada;; 1
CH,Switzerland;; 1399
CH,Switzerland;; 1284
CL,Chile;; 26
CL,Chile;; 15223
CN,China;; 1
CN,China;; 1
CN,China;; 1

...and 4000 more lines.

The main aim is to add these IPs to Elasticsearch, then connect Elasticsearch to Grafana as a data source to see all these IPs on a map. I have been trying to solve this for almost a week and cannot find a correct solution.

I also tried to create a logstash-geoip.conf with grok, but that config does not work :frowning:

Could you please help me with advice? Thank you

Hello @unico, welcome to the community.

Apologies this seemed so hard. Your CSV needs a little parsing.
Plus, with geo_point data you need to define a mapping (data type) first.

In Kibana -> Dev Tools

Create a mapping first. This is important because geo_point needs to be defined ahead of time.

PUT my-discuss-connexions/
{
  "mappings": {
    "properties": {
      "connexion": {
        "type": "long"
      },
      "port": {
        "type": "long"
      },
      "ip": {
        "type": "ip"
      },
      "pais": {
        "type": "keyword"
      },
      "pais_code": {
        "type": "keyword"
      },
      "geoip": {                  <!--- This is the important part
        "properties": {
          "location": {
            "type": "geo_point"
          }
        }
      }
    }
  }
}

Your IPs would probably not geoip because the port was still attached; the dissect pattern below splits the ip and port apart first.

# discuss-csv-ip-data.conf
input {
  file {
    path => "/Users/sbrown/workspace/sample-data/discuss/csv-ip-data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  dissect {
    mapping => {
      "message" => "%{pais_code},%{pais}; %{ip}:%{port}; %{connexion}"
    }
  }
  geoip {
    source => "ip"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-discuss-connexions"
  }
}
$ sudo /Users/sbrown/workspace/elastic-install/7.13.0/logstash-7.13.0/bin/logstash -r -f ./discuss-csv-ip-data.conf

Sample output

      "@version" => "1",
       "message" => "CH,Switzerland;; 1399",
     "pais_code" => "CH",
          "path" => "/Users/sbrown/workspace/sample-data/discuss/csv-ip-data.csv",
         "geoip" => {
         "country_code3" => "CH",
        "continent_code" => "EU",
              "location" => {
            "lon" => 8.1551,
            "lat" => 47.1449
                    "ip" => "",
              "latitude" => 47.1449,
              "timezone" => "Europe/Zurich",
             "longitude" => 8.1551,
         "country_code2" => "CH",
          "country_name" => "Switzerland"
    "@timestamp" => 2021-06-18T14:58:25.682Z,
     "connexion" => "1399",
          "port" => "143",
          "host" => "ceres",
          "pais" => "Switzerland",
            "ip" => ""

Create an index pattern. If you choose to use the time field, you will need to be aware of that on the map...

Add it to a Map in Kibana. (Sorry, I am not a Grafana expert.)



Thank you so much for your answer! It really works!!!

One more question: when I came to work this morning my Kibana was not working, although yesterday everything was fine.
Now I have this error:

root@geoip4g:/usr/share/logstash/bin# ./logstash -r  -f /etc/logstash/conf.d/logstash-geoip.conf
Using bundled JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/ Using default config which logs errors to the console
[INFO ] 2021-06-23 10:00:42.433 [main] runner - Starting Logstash {"logstash.version"=>"7.13.2", "jruby.version"=>"jruby (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[WARN ] 2021-06-23 10:00:43.275 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[FATAL] 2021-06-23 10:00:43.300 [LogStash::Runner] runner - Logstash could not be started because there is already another instance using the configured data directory.  If you wish to run multiple instances, you must change the "path.data" setting.
[FATAL] 2021-06-23 10:00:43.306 [LogStash::Runner] Logstash - Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
        at org.jruby.RubyKernel.exit(org/jruby/ ~[jruby-complete-]
        at org.jruby.RubyKernel.exit(org/jruby/ ~[jruby-complete-]
        at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]

Also, I have an error in the browser.

Hi @unico

Good to hear the Geo IP works.

  1. You are in the bin directory; you need to run Logstash from its base directory.

  2. It also looks like you are trying to run more than one instance of Logstash (see the sketch after this list if you really do need two).

  3. The info above is about Logstash, not Kibana, so I do not know why Kibana is failing.

  4. If Kibana is failing, please open a separate thread on that with a specific subject and all pertinent info: kibana.yml, startup command, startup logs.

  5. If Logstash keeps failing, do the same and open a new thread.
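If you genuinely need a second Logstash instance, each one must point at its own data directory. As a minimal sketch (the /tmp path here is an example, not something from your setup):

sudo /usr/share/logstash/bin/logstash --path.data /tmp/logstash-second -f /etc/logstash/conf.d/logstash-geoip.conf

More likely, though, a previous instance is still running (or left its lock in the data directory), so check for that first.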

Ok, I see.

I need a little bit more of your professional help. What should I do if I have not just one file but, for example, 10? Should I create a *.conf file in /etc/logstash/conf.d for each file with IPs?

C:\Users\soporte\Downloads\informe_21-04-21_26-04-21\ ips

C:\Users\soporte\Downloads\informe_21-04-21_26-04-21\ total

Take a look at the docs here

For all files of the same type you can use a path glob like:

"path" => "/Users/sbrown/workspace/sample-data/**/*.csv

You will probably need to make different confs for different file types / structures; that would make more sense. One way to run several confs side by side is sketched below.
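If you do end up with several confs, Logstash's pipelines.yml can run each one as its own pipeline. A sketch, where the pipeline ids and conf file names are made-up examples:

- pipeline.id: ips
  path.config: "/etc/logstash/conf.d/ips.conf"
- pipeline.id: total
  path.config: "/etc/logstash/conf.d/total.conf"

Note that pipelines.yml is only used when you start Logstash without -f; your log above shows it being ignored for exactly that reason.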

Also be aware of this setting:

sincedb_path => "/dev/null"

That means: do not keep track of which files have been loaded. Once you are ready, take that line out; then each file will be loaded only once, and new files will be picked up when they arrive.
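As a sketch of what the production version could look like, the input would track progress in a real sincedb file instead (both paths here are examples, not from your setup):

input {
  file {
    path => "/etc/logstash/data/*.csv"
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb-csv-ips"
  }
}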

