Data does not get loaded into Elasticsearch

I am trying to load data into my AWS Elasticsearch domain from my EC2 instance, but no data ever makes it into the index. I am very new to this, so any help is highly appreciated.

I created the index below in Kibana:
PUT logstash-trades
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 1
  },
  "mappings": {
    "tracetrddoc1": {
      "properties": {
        "cusip":              { "type": "text" },
        "reportingpartyside": { "type": "text" },
        "tradedate":          { "type": "date" },
        "marketsegment":      { "type": "integer" },
        "sectorid":           { "type": "integer" }
      }
    }
  }
}
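As a sanity check, the mapping can be verified from the Kibana Dev Tools console (nothing assumed here beyond the index name above):

GET logstash-trades/_mapping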
Below is my logstash config file:
input {
  file {
    path => "/home/ec2-user/es_data/temptrds.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => "|"
    columns => [
      "issuepriceid", "tradedate", "tradeweek", "trademonth", "tradequarter", "tradeyear",
      "sequencenumber", "reportingpartyside", "effectivedate", "price", "yield", "spread",
      "estimatedquantity", "yearstomaturity", "marketspread", "issueid", "cusip",
      "liquidityscore", "maturity", "issuedate", "paymenttype", "interestratetype",
      "paymentfrequency", "redemption", "countrycode", "marketsegment", "snprating",
      "moodyrating", "isocurrencycode", "sectorid", "issueamount", "amountoutstanding",
      "callable", "putable", "sinkable", "makewhole", "tracecode", "industry",
      "parent_party_id", "isinvestmentgrade", "productid"
    ]
  }
  mutate { convert => ["issuepriceid", "integer"] }
  mutate { convert => ["sequencenumber", "integer"] }
  mutate { convert => ["price", "float"] }
  mutate { convert => ["yield", "float"] }
  mutate { convert => ["spread", "float"] }
  mutate { convert => ["estimatedquantity", "float"] }
  mutate { convert => ["yearstomaturity", "integer"] }
  mutate { convert => ["marketspread", "float"] }
  mutate { convert => ["issueid", "integer"] }
  mutate { convert => ["liquidityscore", "integer"] }
  mutate { convert => ["marketsegment", "integer"] }
  mutate { convert => ["snprating", "integer"] }
  mutate { convert => ["moodyrating", "integer"] }
  mutate { convert => ["isocurrencycode", "integer"] }
  mutate { convert => ["sectorid", "integer"] }
  mutate { convert => ["issueamount", "float"] }
  mutate { convert => ["amountoutstanding", "float"] }
  mutate { convert => ["productid", "integer"] }
}

output {
  elasticsearch {
    hosts => ["https://vpc-xxx.us-east-1.es.amazonaws.com:443"]
    index => "logstash-trades"
  }
}

I then ran the command: sudo bin/logstash -f /home/ec2-user/es_data/logstash2.config --path.settings /etc/logstash

When I check the number of documents in my index with GET /_cat/indices?v, I see 0 docs.

As far as I know, AWS ES does not work with the elasticsearch output plugin, so I believe you need to install and use the amazon_es output plugin.
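Independently of the output plugin, it is also worth confirming that Logstash is actually reading and parsing the CSV at all. A minimal sketch for that (reusing your file path and keeping your existing filter block as-is; the sincedb_path and stdout output are only debugging aids):

input {
  file {
    path => "/home/ec2-user/es_data/temptrds.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # re-read the file on every run instead of resuming from the sincedb
  }
}
output {
  stdout { codec => rubydebug }  # print each parsed event to the console
}

If events show up on stdout, the file and csv side is fine and the problem is in the output stage.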

Thanks so much for your response.

I installed the plugin: sudo bin/logstash-plugin install logstash-output-amazon_es

and changed my output to:
output {
  amazon_es {
    hosts => ["https://xxx.us-east-1.es.amazonaws.com:443"]
    index => "logstash-trades"
  }
}
But I am getting the error below:
[2018-10-19T17:50:04,202][ERROR][logstash.pipeline] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x26117f44 …>", :thread=>"#<Thread:0x1378162c run>"}

[2018-10-19T17:50:04,207][ERROR][logstash.pipeline] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<NoMethodError: undefined method …>, :backtrace=>[
  "…/aws-sdk-core-2.11.136/lib/aws-sdk-core/signers/v4.rb:45:in `initialize'",
  "…/logstash-output-amazon_es-6.4.0-java/lib/logstash/outputs/amazon_es/…adapter.rb:111:in `perform_request'",
  "…/logstash-output-amazon_es-6.4.0-java/lib/logstash/outputs/amazon_es/http_client/pool.rb:245:in `block in healthcheck!'",
  "…/amazon_es/http_client/pool.rb:241:in `healthcheck!'",
  "…/amazon_es/http_client/pool.rb:341:in `update_urls'",
  "…/amazon_es/http_client.rb:316:in `build_pool'",
  "…/amazon_es/http_client.rb:66:in `initialize'",
  "…/amazon_es/http_client_builder.rb:98:in `build'",
  "…/outputs/amazon_es.rb:253:in `build_client'",
  "…Ext.java:102:in `register'",
  "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:46:in `register'",
  "…/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'",
  "org/jruby/RubyArray.java:1734:in `each'",
  "…/logstash-core/lib/logstash/pipeline.rb:594:in `maybe_setup_out_plugins'",
  "…/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'",
  "…/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"
], :thread=>"#<Thread:0x1378162c run>"}

[2018-10-19T17:50:04,228][ERROR][logstash.agent] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"…", :backtrace=>nil}

I suspect you may also need to add the region and credentials, but I have never used that plugin myself.
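Something along these lines might work; I have not tested it, and the region/credential settings are just taken from the plugin's documented options (its README examples also seem to pass the bare domain endpoint in hosts, without the https:// prefix):

output {
  amazon_es {
    hosts => ["vpc-xxx.us-east-1.es.amazonaws.com"]  # bare endpoint, no scheme
    region => "us-east-1"
    # Static keys shown for illustration only; I believe the plugin can also fall back
    # to the usual AWS credential chain (environment variables / instance profile).
    aws_access_key_id => "YOUR_ACCESS_KEY"
    aws_secret_access_key => "YOUR_SECRET_KEY"
    index => "logstash-trades"
  }
}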
