On the same cloud, I have Logstash 6.6.1 running in a Kubernetes pod, and I am trying to make it index into Elasticsearch 8.6.1 running in a Docker container hosted on a VM with IP address 10.85.173.33.
A manual bulk index from my laptop succeeded:
$ curl -vvk -XPOST 'https://10.85.173.33:9200/_bulk?refresh&pretty' -H 'Content-Type: application/json' -u my_user:my_pass --data-binary "@my_doc.json"
Note: Unnecessary use of -X or --request, POST is already inferred.
* processing: https://10.85.173.33:9200/_bulk?refresh&pretty
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying 10.85.173.33:9200...
* Connected to 10.85.173.33 (10.85.173.33) port 9200
* schannel: disabled automatic use of client certificate
* schannel: using IP address, SNI is not supported by OS.
* using HTTP/1.x
* Server auth using Basic with user 'my_user'
> POST /_bulk?refresh&pretty HTTP/1.1
> Host: 10.85.173.33:9200
> Authorization: Basic <redacted>
> User-Agent: curl/8.2.1
> Accept: */*
> Content-Type: application/json
> Content-Length: 1428
>
} [1428 bytes data]
* schannel: remote party requests renegotiation
* schannel: renegotiating SSL/TLS connection
* schannel: SSL/TLS connection renegotiated
< HTTP/1.1 200 OK
< X-elastic-product: Elasticsearch
< content-type: application/json
< content-length: 450
<
{ [450 bytes data]
100 1878 100 450 100 1428 1058 3359 --:--:-- --:--:-- --:--:-- 4429{
"took" : 77,
"errors" : false,
"items" : [
{
"index" : {
"_index" : "logstash-2024.01.29",
"_id" : "8881-IwBnfFXgAFtvWKK",
"_version" : 1,
"result" : "created",
"forced_refresh" : true,
"_shards" : {
"total" : 1,
"successful" : 1,
"failed" : 0
},
"_seq_no" : 58552,
"_primary_term" : 1,
"status" : 201
}
}
]
}
* Connection #0 to host 10.85.173.33 left intact
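For completeness, the same request can be replayed from inside the Logstash pod to rule out a network or TLS problem between the pod and the VM (the namespace and pod name below are placeholders, not the real ones):

```shell
# Replay the same bulk request from inside the Logstash pod.
# "my-namespace" and "my-logstash-pod" are placeholders -- substitute
# the actual namespace and pod name from `kubectl get pods`.
kubectl exec -n my-namespace my-logstash-pod -- \
  curl -sk -u my_user:my_pass \
  -H 'Content-Type: application/json' \
  --data-binary '@my_doc.json' \
  'https://10.85.173.33:9200/_bulk?refresh&pretty'
```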
I am trying with the following pipeline output configuration:
output {
  elasticsearch {
    document_id => "%{document_id}"
    hosts => ["https://10.85.173.33:9200"]
    user => "my_user"
    password => "my_pass"
    ssl => true
    ssl_certificate_verification => false
    index => "logstash-%{+YYYY.MM.dd}"
    manage_template => true
    template => "/usr/share/logstash/templates/logstash.json"
    template_name => "logstash"
    template_overwrite => true
    retry_on_conflict => 5
    cacert => "/usr/share/logstash/config/ca.crt"
  }
  stdout { codec => rubydebug }
}
Nothing reaches Elasticsearch, and no documents are printed to stdout.
The Logstash log keeps outputting:
[2024-01-29T12:54:19,828][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://10.85.173.33:9200/_bulk"}
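The ERROR line does not include the body of the 400 response. One way to surface it, assuming the Logstash HTTP API is on its default port 9600, is to raise the elasticsearch output's logger to DEBUG at runtime:

```shell
# Raise the elasticsearch output's log level at runtime so the body of
# the 400 response appears in the Logstash log. Assumes the Logstash
# HTTP API is listening on its default port, 9600.
curl -XPUT 'http://localhost:9600/_node/logging?pretty' \
  -H 'Content-Type: application/json' \
  -d '{ "logger.logstash.outputs.elasticsearch" : "DEBUG" }'
```

So far this has not pointed me at the cause.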
I would appreciate any hint on how I can make this work.