Unable to ingest CSV file into Elasticsearch through Logstash

C:\Users\ramya.t\logstash-7.3.0\bin>logstash -f C:\Users\ramya.t\logstash-7.3.0\bin\log.conf
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.runtime.encoding.EncodingService (file:/C:/Users/ramya.t/logstash-7.3.0/logstash-core/lib/jars/jruby-complete-9.2.7.0.jar) to field java.io.Console.cs
WARNING: Please consider reporting this to the maintainers of org.jruby.runtime.encoding.EncodingService
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to C:/Users/ramya.t/logstash-7.3.0/logs which is now configured via log4j2.properties
[2019-08-21T09:53:15,473][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-08-21T09:53:15,493][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.0"}
[2019-08-21T09:53:17,020][INFO ][org.reflections.Reflections] Reflections took 51 ms to scan 1 urls, producing 19 keys and 39 values
[2019-08-21T09:53:22,670][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-08-21T09:53:22,988][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-08-21T09:53:23,032][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-08-21T09:53:23,034][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-08-21T09:53:23,048][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-08-21T09:53:23,082][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-08-21T09:53:23,114][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-08-21T09:53:23,117][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x18e2a385 run>"}
[2019-08-21T09:53:23,142][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-08-21T09:53:24,185][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Users/ramya.t/logstash-7.3.0/data/plugins/inputs/file/.sincedb_51b5ce8e600f3c006b65b5802f40752e", :path=>["C:\Users\ramya.t\Downloads\ELK\AB_FINAL.csv"]}
[2019-08-21T09:53:24,228][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-08-21T09:53:24,276][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-21T09:53:24,300][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-08-21T09:53:24,766][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

This is what I am getting in the command prompt. I'm new to the ELK Stack. Kindly help.

That looks normal. What is your issue and what does your configuration look like?

I went to Kibana to check whether the index was created, but it wasn't.
My config file :point_down:

input {
  file {
    path => "C:\Users\ramya.t\Downloads\ELK\AB_FINAL.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "USERNAME", "SEQUENCEID", "REQUEST_TIMESTAMP", "A_FLOORSUITE",
      "A_BUILDINGNAME", "A_PREMISESNUMBER", "A_STREETNAME", "A_CITYTOWN",
      "A_STATE", "A_POSTALZIPCODE", "A_LATITUDE", "A_LONGITUDE",
      "A_LONGADDRESS", "A_SITETELEPHONENUMBER", "A_RADIUS", "A_ISHUB",
      "A_COLTOPERATINGCOUNTRY", "A_REQUIREDPRODUCT", "A_BANDWIDTH",
      "A_CONNECTIVITYTYPE1", "A_CONNECTIVITYTYPE2", "A_CONNECTIVITYTYPE3",
      "A_CONNECTIVITYTYPE4", "B_FLOORSUITE", "B_BUILDINGNAME",
      "B_PREMISESNUMBER", "B_STREETNAME", "B_CITYTOWN", "B_STATE",
      "B_POSTALZIPCODE", "B_LATITUDE", "B_LONGITUDE", "B_LONGADDRESS",
      "B_SITETELEPHONENUMBER", "B_RADIUS", "B_ISHUB",
      "B_COLTOPERATINGCOUNTRY", "B_REQUIREDPRODUCT", "B_BANDWIDTH",
      "B_CONNECTIVITYTYPE1", "B_CONNECTIVITYTYPE2", "B_CONNECTIVITYTYPE3",
      "B_CONNECTIVITYTYPE4", "SCHEMAVERSION", "REQUESTTYPE", "RESPONSE",
      "RESP_ERRORTYPE", "RESP_ERRORCODE", "STATUS",
      "NEARNETSTATUS_AENDRESULT_STATUS", "NEARNETSTATUS_BENDRESULT_STATUS"]
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "ebonding"
  }
  stdout {}
}

Do not use backslashes in the path option of a file input; use forward slashes.
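A minimal sketch of the corrected input block, keeping everything else the same and only switching the path separators:

```
input {
  file {
    path => "C:/Users/ramya.t/Downloads/ELK/AB_FINAL.csv"
    start_position => "beginning"
  }
}
```

After restarting the pipeline, you can confirm the index exists with `GET _cat/indices` in Kibana Dev Tools, or `curl localhost:9200/_cat/indices?v` from the command prompt.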


Thank you!! It's working :)
