Parsing a CSV file into Logstash and Elasticsearch on the server

Hi,

I am a beginner, with hardly 10 days of experience in Logstash.

I have a CSV file (unstructured) and have copied it to the server where Logstash is configured.
I am trying to parse the CSV file with Logstash, but I am getting an error. Kindly check my code below and help me correct it.

###### Code #######
input {
  file {
    path => "filepathandname"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["name","product_name","id","name","status","mdisk_count","vdisk_count","capacity","extent_size","free_capacity","virtual_capacity","used_capacity","real_capacity","overallocation","warning","easy_tier","easy_tier_status","compression_active","compression_virtual_capacity","compression_compressed_capacity","compression_uncompressed_capacity","parent_mdisk_grp_id","parent_mdisk_grp_name","child_mdisk_grp_count","child_mdisk_grp_capacity","type","encrypt"]
  }
}
output {
  elasticsearch {
    hosts => "serverip"
    index => "host_data"
  }
  stdout { }
}

I have worked on something similar. I'm sharing my code; check if this helps:

input {
  file {
    path => "path of the file"
    # read the file from the start and do not persist the read position
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "data_created", "date_last_seen", "price_eur"]
  }
  # cast the numeric columns, which the csv filter parses as strings
  mutate { convert => ["mileage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}

So this is basically a list of cars with all the various columns mentioned above. It works fine for me; try following the same approach.
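If it helps, you can also check a config for syntax errors before indexing anything. This is just a sketch; the Logstash install path and the config file name (cars.conf) are assumptions, so adjust them to your setup:

# test the pipeline configuration and exit without running it
/usr/share/logstash/bin/logstash -f /path/to/cars.conf --config.test_and_exit

# if the test passes, run the pipeline for real
/usr/share/logstash/bin/logstash -f /path/to/cars.conf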

Thanks, I will try it right now. But in your code I couldn't see a date field. As per my understanding, Logstash and Elasticsearch work by recognizing a date field. Correct me if I am wrong.

It is not mandatory; it depends on your needs. If the date field is critical for you, you can opt for it, for example with a date filter like the sketch below.
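For example, if you wanted the date_last_seen column from the cars config above to drive the event timestamp, a date filter could do it. This is only a sketch; the format string is an assumption, so match it to how the dates actually look in your file:

filter {
  date {
    # parse date_last_seen and use it as the event's @timestamp
    match => ["date_last_seen", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}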

Below is my CSV data:

'LATITUDE','LONGITUDE'
'13.98','80.34'
'14.98','81.34'
'15.98','82.34'
'16.98','83.34'
'17.98','84.34'
'18.98','85.34'

Please find my config file below:

input{
file{
path => "/home/xadmin/qwerty/demolog.csv"
start_position => "beginning"
sincedb_path => "/dev/null"
}
}
filter{
csv{
separator => ","
columns => ["LATITUDE", "LONGITUDE"]
}
mutate{convert=>["LATITUDE","float"]
mutate{convert=>["LONGITUDE","float"]
}
output{
elasticsearch{
hosts => "9.xx.xx.xxx:xxxx"
index => "datalong"
document_type=>"longlat"
}
stdout{}
}

**ERROR**
[2018-02-12T07:14:33,653][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 14, column 11 (byte 272) after filter{\n csv{\n\tseparator => ","\n columns => ["LATITUDE", "LONGITUDE"]\n\t}\n mutate{convert=>["LATITUDE","float"]\n mutate", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:171:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:343:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

Try latitude as a string, because a latitude has many parts, right? As in degrees, minutes, seconds, etc.
I guess that is why this error is popping up.

In mutate, use ["LATITUDE","string"].
Or, if you do not use mutate at all, it will take the field as a string by default.
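One more thing worth checking: the "Expected one of #, =>" part of the error is Logstash complaining about the config syntax itself, and in the filter above each mutate{...} block is missing its closing brace. Here is a sketch of that filter section with the braces balanced, keeping the float conversions from the original config (or drop the mutate lines to keep the values as strings, as suggested above):

filter {
  csv {
    separator => ","
    columns => ["LATITUDE", "LONGITUDE"]
  }
  # each mutate block needs its own closing brace before the next one starts
  mutate { convert => ["LATITUDE", "float"] }
  mutate { convert => ["LONGITUDE", "float"] }
}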

Thanks for the help. I have done this using a curl command.
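For reference, indexing a document straight into Elasticsearch with curl looks roughly like this; the host is masked as in the config above, and the index, type, and field values are taken from the sample data:

# post one document into the datalong index (type longlat)
curl -H 'Content-Type: application/json' -X POST 'http://9.xx.xx.xxx:xxxx/datalong/longlat' -d '{"LATITUDE": 13.98, "LONGITUDE": 80.34}'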
