Sample project

Hi All,

I am unable to get my data file into Elasticsearch. Kindly help me; no data is showing up in Kibana.

My Logstash config file:
input {
  file {
    path => "D:\ELK-6.4\logs.jsonl\FL_insurance_sample.csv"
    start_position => "beginning"
    sincedb_path => "/dev/nul"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Cust_ID","Cust_Fname","Cust_Lname","Cust_Email","Cust_City","Cust_Type"]
  }
  mutate { convert => ["Cust_ID", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "customers"
    document_type => "US_Based_Cust"
  }
  stdout {}
}

Output:
D:\ELK-6.4\logstash-6.4.0\bin>logstash -f D:\ELK-6.4\logstash-6.4.0\config\logstash-sample.conf
Sending Logstash logs to D:/ELK-6.4/logstash-6.4.0/logs which is now configured via log4j2.properties
[2018-09-10T16:36:38,258][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-09-10T16:36:39,127][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-09-10T16:36:41,522][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"customers", id=>"4ba64e70f42580f567145d66b0aa54a5509102dbd46464292b1f15b83c629fce", hosts=>[//localhost], document_type=>"US_Based_Cust", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_55988f44-8684-4a6c-b88b-6c24b6a6aba9", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-09-10T16:36:42,897][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-09-10T16:36:43,249][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-09-10T16:36:43,257][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-09-10T16:36:43,384][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-09-10T16:36:43,427][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-09-10T16:36:43,432][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-09-10T16:36:43,459][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-09-10T16:36:43,481][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-09-10T16:36:43,499][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-09-10T16:36:43,887][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1d3d66ec run>"}
[2018-09-10T16:36:43,953][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-09-10T16:36:43,974][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-09-10T16:36:44,285][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Can anyone kindly help me?

sincedb_path => "/dev/nul"

On Windows use "nul", not "/dev/nul".
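For example, the file input would then look roughly like this (same path as in your config; note that forward slashes also work in Windows paths and avoid backslash-escaping surprises in quoted strings):

input {
  file {
    path => "D:/ELK-6.4/logs.jsonl/FL_insurance_sample.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}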

document_type => "US_Based_Cust"

Also, index names must be lowercase only; it's best to keep the document type lowercase as well.
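And since document_type is deprecated in Elasticsearch 6.x and removed in 7.0 (as the warning in your own log says), you could simply drop it, e.g.:

output {
  elasticsearch {
    hosts => "localhost"
    index => "customers"
  }
  stdout { codec => rubydebug }
}

The rubydebug codec on stdout prints each event Logstash emits, which makes it easy to see whether the CSV filter is actually parsing your rows.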

Thanks for the quick response.
I've changed the config as per your comments:
input {
  file {
    path => "D:\ELK-6.4\logs.jsonl\FL_insurance\FL_insurance_sample.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Cust_ID","Cust_Fname","Cust_Lname","Cust_Email","Cust_City","Cust_Type"]
  }
  mutate { convert => ["Cust_ID", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "customers"
    document_type => "us_based_customer"
  }
  stdout {}
}
Output:
D:\ELK-6.4\logstash-6.4.0\bin>logstash -f apache_logstash.conf
Sending Logstash logs to D:/ELK-6.4/logstash-6.4.0/logs which is now configured via log4j2.properties
[2018-09-24T12:54:10,321][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-09-24T12:54:10,757][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.0"}
[2018-09-24T12:54:12,632][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"customers", id=>"568bf95facfde7036131f90f2332ec269412ad64f88084fe77c0c8e21d6b6a4c", hosts=>[//localhost], document_type=>"us_based_customer", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_807cf569-a7cb-481d-bb15-e173baec3840", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-09-24T12:54:13,898][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-09-24T12:54:14,243][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-09-24T12:54:14,258][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-09-24T12:54:14,383][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-09-24T12:54:14,419][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-09-24T12:54:14,419][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-09-24T12:54:14,451][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-09-24T12:54:14,476][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-09-24T12:54:14,492][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-09-24T12:54:14,836][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x690d3714 sleep>"}
[2018-09-24T12:54:14,882][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-09-24T12:54:14,898][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-09-24T12:54:15,117][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Kibana: (screenshot)

Hi, can anybody please give me a sample project flow?

It's not clear to me what the problem is.

Note that Kibana is looking at data from indexes matching customer* but you've configured Logstash to write to the us_based_customer index.
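To verify which indices your data actually ended up in, you can ask Elasticsearch directly, for example:

curl "http://localhost:9200/_cat/indices?v"

and then make sure your Kibana index pattern matches one of the index names listed there.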

Thanks for your reply.

I removed the "document_type" and "sincedb_path" settings and it's working fine.

I am able to view the records in Kibana.
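For reference, the working pipeline would then look roughly like this (a sketch based on the changes described above; with sincedb_path removed, Logstash keeps its own sincedb file and won't re-read the whole file on every restart):

input {
  file {
    path => "D:/ELK-6.4/logs.jsonl/FL_insurance/FL_insurance_sample.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Cust_ID","Cust_Fname","Cust_Lname","Cust_Email","Cust_City","Cust_Type"]
  }
  mutate { convert => ["Cust_ID", "integer"] }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "customers"
  }
  stdout { codec => rubydebug }
}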
