Logstash logging problem

Hello everyone,
I'm having trouble with Logstash when I try to load a CSV file into Elasticsearch; my config file is below.

Elasticsearch version: 6.5.4
Logstash version: 6.6.0

I get this warning:
[WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}

and the last 3 rows look like this:
[2019-04-24T09:47:35,815][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-24T09:47:35,841][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-04-24T09:47:37,760][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

input {
  file {
    path => "C:\Users\akumas\Desktop\Cars\cars.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_power", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "solid_cars"
  }
  stdout {}
}

Did your process stop, or did it index into Elasticsearch with a failure tag?

This is a common warning. It won't affect the process, I hope.

These are the last rows. I think there is no problem, but the process does not complete; there is no response, it just waits.

[2019-04-24T09:47:35,727][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7f8d1314 run>"}
[2019-04-24T09:47:35,815][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-24T09:47:35,841][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-04-24T09:47:37,760][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Can you change your mutate filter code to this format:

mutate { convert => { "mileage" => "integer" } }
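For reference, the hash form of `convert` can also cover all the fields in one mutate block. A sketch based on the columns in the posted config (note that it uses `door_count`, since `door_power` from the original mutate is not one of the CSV columns):

```conf
filter {
  mutate {
    convert => {
      "mileage"      => "integer"
      "price_eur"    => "float"
      "engine_power" => "integer"
      "door_count"   => "integer"
      "seat_count"   => "integer"
    }
  }
}
```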

No, it doesn't work.

Hi Ahmet,

Please run the Logstash config and share the error after starting.
Paste the full error if possible.

Regards
Nandha

Hello, this is the full output, but I cannot see any errors.

C:\Users\akumas\Desktop\logstash-6.6.0\bin>logstash -f C:\Users\akumas\Desktop\Cars\logstash_cars.config
Sending Logstash logs to C:/Users/akumas/Desktop/logstash-6.6.0/logs which is now configured via log4j2.properties
[2019-04-24T13:23:27,578][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-04-24T13:23:27,654][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.6.0"}
[2019-04-24T13:23:48,921][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"cars", id=>"2306a9f312e723e483c65baa0f0891e8e4faadc85e2b0ed07ac630386450dc41", hosts=>[//localhost], document_type=>"sold_cars", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_50a9c1fe-8203-47e3-a514-8e4316ac8330", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>false, ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2019-04-24T13:23:51,453][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-04-24T13:23:52,042][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-04-24T13:24:02,522][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-04-24T13:24:06,468][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-04-24T13:24:06,475][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2019-04-24T13:24:06,533][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-04-24T13:24:06,546][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-04-24T13:24:06,633][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-04-24T13:24:07,945][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x4b274809 sleep>"}
[2019-04-24T13:24:08,018][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-24T13:24:08,029][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-04-24T13:24:09,529][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Use forward slash instead of backslash in the path option on a file input.

The sincedb_path value should be NUL, not NULL.
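Putting those two fixes together, the file input from the original post would become something like this (a sketch: forward slashes in the path, and the Windows null device NUL as the sincedb_path):

```conf
input {
  file {
    path => "C:/Users/akumas/Desktop/Cars/cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
```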

It's Windows, the slashes are okay. I changed it to NUL, but I still get the same output.

Backslashes are not OK in a file input.

You are right, I get this now:
Sending Logstash logs to C:/Users/akumas/Desktop/logstash-6.6.0/logs which is now configured via log4j2.properties
[2019-04-24T16:26:00,217][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-04-24T16:26:00,458][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.6.0"}
[2019-04-24T16:26:02,973][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 5, column 22 (byte 114) after input{\n\tfile{\n\t\tpath=>"C:/Users/akumas/Desktop/Cars/cars.csv"\n\t\tstart_position=>"beginning"\n\t\tsincedb_path=>"NUL"", :backtrace=>["C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/compiler.rb:49:in compile_graph'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2486:in map'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:149:in initialize'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:22:in initialize'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/pipeline.rb:90:in initialize'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/pipeline_action/create.rb:42:in block in execute'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/agent.rb:92:in block in exclusive'", "org/jruby/ext/thread/Mutex.java:148:in synchronize'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/agent.rb:92:in exclusive'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/pipeline_action/create.rb:38:in execute'", "C:/Users/akumas/Desktop/logstash-6.6.0/logstash-core/lib/logstash/agent.rb:317:in block in converge_state'"]}
[2019-04-24T16:26:05,414][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

What does your configuration look like? When you copy and paste your configuration, select the text of the configuration and click on </> in the toolbar above the editing pane so that it is block-quoted.

input {
  file {
    path => "C:/Users/akumas/Desktop/Cars/cars1.csv"
    start_position => "beginning"
    sincedb_path => "NUL"log
  }
}
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_power", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}

I don't really understand why I can't format my code the way you told me for a nicer view.

Remove the trailing "log".

Thank you so much, Badger, problem solved!
