Logstash not responding when importing CSV

I am not able to import the CSV file with the configuration below:

input {
  file {
    path => "C:\change_reque07.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["company","number","requested_by","u_category","u_rfc","u_device_ref_1","u_service_impact","u_urgency_lead_time","requested_by_date","sys_created_on","closed_at","u_stage","state","u_machx_case_reference","sys_updated_on","sys_updated_by","requested_by.company","sys_created_by","opened_at","opened_by","closed_by","u_machx_success_date","u_change_ack_time","u_change_successful","u_device_1","u_device_ref_2","u_device_2","u_device_ref_3","u_device_3","u_device_ref_4","u_device_4","u_device_ref_5","u_device_5","u_device_ref_6"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "cr-02"
  }
  stdout {}
}

The cmd logs are as follows; it gets stuck at the end:

C:\Kibana\logstash-6.2.3\bin>logstash -f logstash.conf
Sending Logstash's logs to C:/Kibana/logstash-6.2.3/logs which is now configured via log4j2.properties
[2018-07-30T15:49:19,204][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Kibana/logstash-6.2.3/modules/fb_apache/configuration"}
[2018-07-30T15:49:19,313][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Kibana/logstash-6.2.3/modules/netflow/configuration"}
[2018-07-30T15:49:19,858][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-30T15:49:21,549][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-07-30T15:49:22,377][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-30T15:49:29,179][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-07-30T15:49:30,192][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-07-30T15:49:30,214][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-07-30T15:49:30,856][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-07-30T15:49:30,990][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-07-30T15:49:31,001][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-07-30T15:49:31,037][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-07-30T15:49:31,133][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-07-30T15:49:31,247][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-07-30T15:49:33,095][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x53cf0cd0 run>"}
[2018-07-30T15:49:33,362][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

A file input will tail the file forever, waiting for new lines to be appended, so this looks normal.

start_position only has an effect the first time you run an input with that configuration. Adding this to the file input might help:

sincedb_path => "NUL"
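Putting that together, a minimal sketch of the file input (keeping your path and start_position, with sincedb_path added) would be:

input {
  file {
    path => "C:\change_reque07.csv"
    start_position => "beginning"
    # NUL is the Windows null device, so the sincedb is never persisted
    # and the file is read from the beginning on every run.
    sincedb_path => "NUL"
  }
}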

Is your config file added to logstash.yml?

No, is it compulsory?

I got the following error after adding sincedb_path => "NUL":

[2018-07-31T11:51:35,253][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2018-07-31T11:51:43,542][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"cr-02", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x65f3db4>], :response=>{"index"=>{"_index"=>"cr-02", "_type"=>"doc", "_id"=>"xkb_7mQB4saQgjbI3siQ", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [requested_by] of different type, current_type [text], merged_type [ObjectMapper]"}}}}

Also:

[2018-07-31T16:44:17,160][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"cr-011", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x7222c5b0>], :response=>{"index"=>{"_index"=>"cr-011", "_type"=>"doc", "_id"=>"XsEL8GQB4827YA68s15F", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Can't merge a non object mapping [requested_by] with an object mapping [requested_by]

That's a good thing; it indicates Logstash is reading events from the file and trying to index them into Elasticsearch.

If you add 'output { stdout { codec => rubydebug } }' to your Logstash configuration, what does the 'requested_by' field look like? Also, if you go to the Discover pane in Kibana and look at a document that has already been indexed, what does the 'requested_by' field look like? Copy and paste it from the JSON tab on an expanded document.
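For reference, a minimal sketch of that output section (sitting alongside your existing elasticsearch output) would be:

output {
  stdout { codec => rubydebug }
}

One possible cause worth checking, since your columns list contains both requested_by and requested_by.company: Elasticsearch treats dots in field names as object paths, so indexing requested_by.company turns requested_by into an object, which conflicts with the text mapping created by the plain requested_by column. That would match the "Can't merge a non object mapping [requested_by] with an object mapping [requested_by]" error. If that turns out to be the problem, one workaround (the new field name here is just an illustration) is to rename the dotted field before it reaches Elasticsearch:

filter {
  mutate {
    # Rename the literal dotted field so it no longer collides with requested_by.
    rename => { "requested_by.company" => "requested_by_company" }
  }
}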

Sometimes you need to point Logstash at your config explicitly: add your filter config file's path to logstash.yml as shown below.

path.config: /home/logstash/logstash-6.2.2/config/myfilter.config

Here, myfilter.config is your filter file, placed in the config folder of Logstash.
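On Windows, assuming the install path shown in your logs and the logstash.conf you ran from bin, the equivalent line would be:

path.config: C:/Kibana/logstash-6.2.3/config/logstash.conf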
