Logstash not creating indexes for csv file input

Hi All

I'm at a loss. I'm trying to process a CSV file via Logstash, but the config below is not creating any indexes in Elasticsearch. Looking at the Logstash logs, there does not appear to be an error in my config. I'm adding my config and debug log below, as I can't seem to spot my mistake. It would be much appreciated if someone with fresh eyes could have a look.

Regards
Emile

My config is as follows:

input {
  # server logs
  # Capacity csv dump

  file {
    path => "/root/grok/ServerCapacity.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }

}

filter {
     csv {
        columns => ["ID","Host","IPAddress","ServerDate","UserPercentage","SysPercentage","WaitPercentage","IdlePercentage","CPUPercentage","PhysicalCPUs","HomeUsage","UsrUsage","VarUsage","TmpUsage","SrvUsage","WeblogicUsage","MercuryUsage","OraclecoreUsage","WebSphereUsage","IBMCoreUsage","MemoryPercentage","RealMemory","MemoryGB","EnsWrite","Environment","ServerMonth","ServerYear"]
        separator => ";"
     }
}

output {
      elasticsearch {
        action => "index"
        hosts => ["telk001zadprh","telk002zadprh","telk003zadprh"]
        index => "logstash-capacity-server-%{+YYYY.MM.dd}"
      }
}

Debug log (no errors reported):

[2018-11-22T07:44:05,165][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-22T07:44:05,305][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-11-22T07:44:05,689][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x774fa6bc run>"}
[2018-11-22T07:44:05,833][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-22T07:44:05,919][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-11-22T07:44:05,936][DEBUG][logstash.agent           ] Starting puma
[2018-11-22T07:44:05,973][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2018-11-22T07:44:06,055][DEBUG][logstash.api.service     ] [api-service] start
[2018-11-22T07:44:06,294][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-22T07:44:09,944][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-22T07:44:09,945][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-22T07:44:10,739][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x774fa6bc sleep>"}
[2018-11-22T07:44:14,952][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-22T07:44:14,962][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-22T07:44:15,743][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x774fa6bc sleep>"}
[2018-11-22T07:44:19,970][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-22T07:44:19,972][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-22T07:44:20,745][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x774fa6bc sleep>"}
[2018-11-22T07:44:24,978][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-22T07:44:24,981][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-22T07:44:25,746][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x774fa6bc sleep>"}
[2018-11-22T07:44:29,985][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-22T07:44:29,991][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}

Hello,

I think you need to change separator => ";" to separator => ",". The separator must match the delimiter actually used in your CSV file; with the wrong separator, each line is parsed as a single field and nothing useful is indexed.
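Assuming ServerCapacity.csv is actually comma-delimited, the filter block would look like this (columns unchanged from your config):

```text
filter {
  csv {
    columns => ["ID","Host","IPAddress","ServerDate","UserPercentage","SysPercentage","WaitPercentage","IdlePercentage","CPUPercentage","PhysicalCPUs","HomeUsage","UsrUsage","VarUsage","TmpUsage","SrvUsage","WeblogicUsage","MercuryUsage","OraclecoreUsage","WebSphereUsage","IBMCoreUsage","MemoryPercentage","RealMemory","MemoryGB","EnsWrite","Environment","ServerMonth","ServerYear"]
    # Must match the delimiter used in the file; the csv filter defaults to ","
    separator => ","
  }
}
```

A quick way to check which delimiter your file really uses is to run head -1 /root/grok/ServerCapacity.csv and look at the first line.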

Hope this helps.

Thanks for the help.

I got it working.

Kind Regards
Emile

Great! Please mark the answer above as the solution.

Regards,
Balu

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.