Created a new config file, no errors but not loaded in Kibana

I have a CSV file and created a new configuration file for Logstash:

input {
  file {
    path => "/Users/naka/Documents/exploringELK/cims1819.csv"
    type => "csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["ReferenceID","Vtype","MasterRefID","Branch","PostedBy","LPName","Scity","Sstate","Dcity","Dstate","TonnageRequested","TonnagePlaced","LPRate","SalesQuoteID","L1Name","L1BidQuotedBy","L1Amount","L1Margin","L1ATH","L1BidingDateTime","L2Name","L2BidQuotedBy","L2Amount","L2Margin","L2ATH","L2BidingDateTime","L3Name","L3BidQuotedBy","L3Amount","L3Margin","L3ATH","L3BidingDateTime","L4Name","L4BidQuotedBy","L4Amount","L4Margin","L4ATH","L4BidingDateTime","BookingID"]
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

I don't get any error messages when I launch it with

logstash -f cimsLogstash.conf

However, the prompt never comes back in my terminal, and the CSV data never shows up in Kibana.

This is just my second day with ELK.

Output from the command:

Sending Logstash logs to /usr/local/Cellar/logstash/6.7.0/libexec/logs which is now configured via log4j2.properties
[2019-05-13T16:43:15,857][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-13T16:43:15,887][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.7.0"}
[2019-05-13T16:43:26,836][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-05-13T16:43:27,474][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-05-13T16:43:27,787][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-05-13T16:43:27,874][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-05-13T16:43:27,878][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2019-05-13T16:43:27,919][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-05-13T16:43:27,922][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["//localhost:9200"]}
[2019-05-13T16:43:27,948][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-05-13T16:43:28,026][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2019-05-13T16:43:28,431][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/local/Cellar/logstash/6.7.0/libexec/data/plugins/inputs/file/.sincedb_38a4ab446ffffd93d60848ae887dc785", :path=>["/Users/naka/Documents/exploringELK/cims1819.csv"]}
[2019-05-13T16:43:28,509][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x732fb2e5 run>"}
[2019-05-13T16:43:28,590][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-13T16:43:28,603][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-13T16:43:29,110][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

After I press Ctrl-C to terminate the process:

^C[2019-05-13T16:45:54,790][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2019-05-13T16:45:55,000][INFO ][filewatch.observingtail ] QUIT - closing all files and shutting down.
[2019-05-13T16:45:56,395][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x732fb2e5 run>"}

Try adding

sincedb_path => "/dev/null"

to your file input. If that does not help, run with "--log.level trace" and see what filewatch has to say.
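Something like this, reusing the path from your own config (a rough sketch, untested; the /dev/null sincedb just means Logstash won't remember how far it has already read):

input {
  file {
    path => "/Users/naka/Documents/exploringELK/cims1819.csv"
    type => "csv"
    start_position => "beginning"
    # Don't persist read positions; combined with start_position => "beginning"
    # this makes Logstash re-read the whole CSV on every restart.
    sincedb_path => "/dev/null"
  }
}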

Thanks! That and adding an index name helped! Now I need to figure out how to visualise the data I have uploaded :sweat_smile:
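In case it helps anyone else, my output section now looks roughly like this (the index name is just the one I picked, not anything special):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Explicit index name instead of the default daily logstash-* index;
    # this is the name to point the Kibana index pattern at.
    index => "cims1819"
  }
}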
