Logstash cannot upload to https Elasticsearch

Hi everyone,

I'm having trouble uploading data from a CSV file to Elasticsearch over HTTPS, using Logstash.

This is the configured logstash.conf:

input {
    file {
        path => "${pwd}/some-metrics.csv"
        file_completed_action => "log"
        file_completed_log_path => "${pwd}/done.log"
        sincedb_path => "/dev/null"
        start_position => beginning
    }
}
filter {
    csv {
        columns => ["applicationIdentification", "applicationDescription", "applicationName"]
        skip_header => true
        separator => ","
    }
}
output {
  elasticsearch {
    hosts => ["https://elasticsearch-example.com:443"]
    index => "some-metrics-%{+YYYY.MM}"
    user => "username"
    password => "pw"
    ssl_enabled => true
    ssl_certificate_authorities => '${pwd}/cacerts/elasticsearch-example.com.cer'
  }
}

During execution, Logstash outputs the following errors:

[INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://elasticsearch-example.com:443"]}
[INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://****:xxxxxx@elasticsearch-example.com:443/]}}
[ERROR][logstash.outputs.elasticsearch][main] Unable to retrieve Elasticsearch version {:exception=>LogStash::Json::ParserError, :message=>"Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: (byte[])\"<!DOCTYPE html><html lang=\"en\"><head><meta charSet=\"utf-8\"/><meta http-equiv=\"X-UA-Compatible\" content=\"IE=edge,chrome=1\"/><meta name=\"viewport\" content=\"width=device-width\"/><title>Elastic</title><style>\n        \n        @font-face {\n          font-family: 'Inter';\n          font-style: normal;\n          font-weight: 100;\n          src: url('/ui/fonts/inter/Inter-Thin.woff2') format('woff2'), url('/ui/fonts/inter/Inter-Thin.woff') format('woff');\n        }\n\n        @font-face {\n          font-f\"[truncated 191025 bytes]; line: 1, column: 2]"}
[ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Could not connect to a compatible version of Elasticsearch>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `block in healthcheck!'", "org/jruby/RubyHash.java:1587:in `each'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:262:in `healthcheck!'", ...
[...]

This is the command used to start Logstash:

/usr/share/logstash/bin/logstash --path.config ${pwd}/logstash.conf --path.logs /tmp/logstash.log --path.settings /etc/logstash/ --path.data ${pwd}

Logstash, Elasticsearch, and Kibana are all the same version, namely 8.11.1.

Elasticsearch and Kibana (ELK stack) are served over HTTP, but reverse proxied to HTTPS with nginx.

The strange thing is that uploading over HTTP works, but not over HTTPS.

Can anyone help me?

Thanks a lot.

Best regards,
Gabriele

I would do the following:

  • make sure that your logstash.yml, pipelines.yml, and logstash.conf have correct syntax, and that the ${pwd} variable has the correct value (see the quick checks after this list)
  • use a directory, not a file: --path.logs /tmp/, and make sure it is writable by the user running LS
  • check your LS version, because you are getting the error: Could not connect to a compatible version of Elasticsearch
  • put the path in double quotes and make sure the user running LS has read rights to it:
    ssl_certificate_authorities => "${pwd}/cacerts/elasticsearch-example.com.cer"
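
Both of those can be sanity-checked from the shell, roughly like this (a sketch; substitute your real paths; --config.test_and_exit only validates the pipeline syntax):

$ /usr/share/logstash/bin/logstash --config.test_and_exit --path.config /path/to/logstash.conf
$ sudo -u logstash cat /path/to/cacerts/elasticsearch-example.com.cer > /dev/null && echo readable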

Hi Rios,

The ${pwd} variable is set to the current path, and the other Logstash files are correctly written; otherwise the HTTP upload wouldn't have worked either.

On Elasticsearch I created a logstash user with the following cluster privileges: manage_index_templates, monitor, manage_ilm, and manage_logstash_pipelines, and the following index privileges: write, create, create_index, manage, and manage_ilm.
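
For reference, the role was created roughly like this via the security API (a sketch; the role name and index pattern are illustrative):

$ curl -u elastic -X PUT "http://localhost:9200/_security/role/logstash_writer" \
    -H 'Content-Type: application/json' -d '{
      "cluster": ["manage_index_templates", "monitor", "manage_ilm", "manage_logstash_pipelines"],
      "indices": [{
        "names": ["some-metrics-*"],
        "privileges": ["write", "create", "create_index", "manage", "manage_ilm"]
      }]
    }'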

This user works over HTTP; do more privileges need to be added for HTTPS?

I also checked and applied the other recommendations you gave me, but it still didn't work (the Logstash output log shows the same errors).

Thanks :slightly_smiling_face:

Gabriele

If HTTP works, check this parameter: does the user running LS have rights to read the file? Be aware that the LS service runs under the "logstash" user. Since nginx is in front, LS must trust the same CA.
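
One way to see which certificate chain nginx actually presents on 443 (a quick check; hostname taken from your config):

$ openssl s_client -connect elasticsearch-example.com:443 -servername elasticsearch-example.com </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer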

For HTTP, these two parameters in the logstash.conf file, ssl_enabled => true and ssl_certificate_authorities => "${pwd}/cacerts/elasticsearch-example.com.cer", are not needed.

The LS user probably works; otherwise HTTP wouldn't have worked either.

The certificate referenced in ssl_certificate_authorities is the one configured in the nginx configuration file, so it should be trusted.

Could it be that Logstash itself also needs to be configured with SSL? The input in the Logstash configuration file is not an elastic_agent or Beats input but a CSV file, and only the output uses HTTPS.

I would say the user running LS doesn't have permission to read elasticsearch-example.com.cer. Also, temporarily replace ${pwd} with a hardcoded value.

Set log.level: debug in logstash.yml to get more info.

After replacing the ${pwd} value with a specific path and setting the Logstash log level to debug, this is the output:

[DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"some-metrics/logstash.conf"}
[DEBUG][logstash.agent           ] Trying to start API WebServer {:port=>9600, :ssl_enabled=>false}
[DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"input", :class=>LogStash::Inputs::File}
[DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@start_position = "beginning"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@path = ["some-metrics/some-metrics.csv"]
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_completed_action = "log"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_path = "/dev/null"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_completed_log_path = "some-metrics/done.log"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@enable_metric = true
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@add_field = {}
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@stat_interval = 1.0
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@discover_interval = 15
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_write_interval = 15.0
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@delimiter = "\n"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@close_older = 3600.0
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@mode = "tail"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@sincedb_clean_after = 1209600.0
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_chunk_size = 32768
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_chunk_count = 140737488355327
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_sort_by = "last_modified"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@file_sort_direction = "asc"
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@exit_after_read = false
[DEBUG][logstash.inputs.file     ] config LogStash::Inputs::File/@check_archive_validity = false
[DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"csv", :type=>"filter", :class=>LogStash::Filters::CSV}
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_header = true
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@separator = ","
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@columns = ["applicationIdentification", "applicationDescription", "applicationName"]
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@enable_metric = true
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@add_tag = []
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@remove_tag = []
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@add_field = {}
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@remove_field = []
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@periodic_flush = false
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@source = "message"
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@quote_char = "\""
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autogenerate_column_names = true
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_empty_columns = false
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@skip_empty_rows = false
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@convert = {}
[DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autodetect_column_names = false
[DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://elasticsearch-example.com:443]
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_enabled = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "some-metrics-%{+YYYY.MM}"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "****"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_authorities = ["some-metrics/cacerts/elasticsearch-example.com.cer"]
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_verification_mode = "full"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_supported_protocols = []
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@silence_errors_in_log = []
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@compression_level = 1
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@dlq_custom_codes = []
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@dlq_on_failed_indexname_interpolation = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_type = "logs"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_dataset = "generic"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_namespace = "default"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_sync_fields = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_auto_routing = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_api = "auto"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = "auto"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_policy = "logstash-policy"
[INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :main, :flow]`
[DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :main, :flow]`
[DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://elasticsearch-example.com:443"]}
[DEBUG][logstash.outputs.elasticsearch][main] Normalizing http path {:path=>nil, :normalized=>nil}
[INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://****:xxxxxx@elasticsearch-example.com:443/]}}
[DEBUG][logstash.outputs.elasticsearch][main] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>"https://****:xxxxxx@elasticsearch-example.com:443/", :path=>"/"}
[ERROR][logstash.outputs.elasticsearch][main] Unable to retrieve Elasticsearch version {:exception=>LogStash::Json::ParserError, :message=>"Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')\n at [Source: (byte[])\"<!DOCTYPE html><html lang=\"en\"><head><meta charSet=\"utf-8\"/><meta http-equiv=\"X-UA-Compatible\" content=\"IE=edge,chrome=1\"/><meta name=\"viewport\" content=\"width=device-width\"/><title>Elastic</title><style>\n        \n        @font-face {\n          font-family: 'Inter';\n          font-style: normal;\n          font-weight: 100;\n          src: url('/ui/fonts/inter/Inter-Thin.woff2') format('woff2'), url('/ui/fonts/inter/Inter-Thin.woff') format('woff');\n        }\n\n        @font-face {\n          font-f\"[truncated 191025 bytes]; line: 1, column: 2]"}
[ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Could not connect to a compatible version of Elasticsearch>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `block in healthcheck!'", "org/jruby/RubyHash.java:1587:in `each'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:262:in `healthcheck!'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:396:in `update_urls'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:104:in `update_initial_urls'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:98:in `start'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:369:in `build_pool'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:63:in `initialize'", "org/jruby/RubyClass.java:904:in `new'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:106:in `create_http_client'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:102:in `build'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:42:in `build_client'", "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-elasticsearch-11.19.0-java/lib/logstash/outputs/elasticsearch.rb:301:in `register'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:69:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:237:in `block in register_plugins'", "org/jruby/RubyArray.java:1987:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:236:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:610:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:249:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:194:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:146:in `block in start'"], "pipeline.sources"=>["some-metrics/logstash.conf"], :thread=>"#<Thread:0x3e7095fd /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}

Do you have any ideas?

"Unexpected character ('<' (code 60)):

Check your nginx.conf; maybe you have wrong syntax in the HTTPS/443 port part.
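
A quick way to see what the proxy actually returns on port 443 is to hit it with curl (a sketch; -k skips certificate verification just for this test):

$ curl -sk -u username https://elasticsearch-example.com/ | head -c 200

If that prints <!DOCTYPE html> rather than a JSON document, the request is not reaching Elasticsearch at all.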

This is the nginx.conf file:

$ cat /etc/nginx/nginx.conf
user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /run/nginx.pid;

include /usr/share/nginx/modules/*.conf;

events {
    worker_connections      2048;
}

http {
    log_format  main        '$remote_addr - $remote_user [$time_local] "$request" '
                            '$status $body_bytes_sent "$http_user_agent"';

    access_log              /var/log/nginx/access.log main buffer=32k flush=3m;

    sendfile                on;
    tcp_nopush              on;
    tcp_nodelay             on;
    keepalive_timeout       65;
    types_hash_max_size     2048;

    # disable checking of request body size
    client_max_body_size    0;

    # disable emitting nginx version on error pages
    server_tokens           off;

    # increase gateway timeout
    proxy_connect_timeout   600;
    proxy_send_timeout      600;
    proxy_read_timeout      600;
    send_timeout            600;

    include                 /etc/nginx/mime.types;
    default_type            application/octet-stream;

    include                 /etc/nginx/conf.d/*.conf;
}

This is the specific elasticsearch.conf inside /etc/nginx/conf.d/:

$ cat /etc/nginx/conf.d/elasticsearch.conf
upstream kibana {
    server localhost:5601 fail_timeout=0;
}

upstream elasticsearch {
    server localhost:9200 fail_timeout=0;
}

server {
    listen                  80;
    server_name             elasticsearch.linuxserver.com;
#     return                  301 https://$server_name$request_uri;
    location / {
        proxy_pass          http://elasticsearch;
    }
}

server {
    listen                  443 ssl;
    server_name             elasticsearch.linuxserver.com;
    ssl_certificate         /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.crt;
    ssl_certificate_key     /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.key;
    ssl_password_file       /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.pw;

    location / {
        add_header          Strict-Transport-Security "max-age=31536000; includeSubdomains; preload" always;
        add_header          Access-Control-Allow-Origin $http_origin;
        proxy_set_header    Host $host:$server_port;
        proxy_set_header    X-Real-IP $remote_addr;
        proxy_set_header    X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header    X-Forwarded-Proto $scheme;
        proxy_redirect      http:// https://;
        proxy_pass          http://kibana;
    }
}

The nginx configuration to consider is elasticsearch.conf; the "elasticsearch.linuxserver.com" server can also be reached via the DNS name "elasticsearch-example.com".

Is there any syntax error in it? Should the server name inside elasticsearch.conf be changed to "elasticsearch-example.com", or doesn't it matter?

Port 80 is forwarding to Elasticsearch:

    location / {
        proxy_pass          http://elasticsearch;
    }

Port 443 is forwarding to Kibana:

        proxy_pass          http://kibana;

With this configuration, should I configure the following inside the logstash.conf file

[...]
output {
  elasticsearch {
    hosts => ["https://elasticsearch-example.com:80"]
[...]
}

in order to make it work? Would it be better not to forward Elasticsearch at all, or to forward it on another port (normally port 80 is for HTTP)?

I just tried setting "https://elasticsearch-example.com:80", but it outputs warnings like these (the SSL errors below presumably come from speaking TLS to a plain-HTTP port):

[INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://elasticsearch-example.com:80"]}
[INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://****:xxxxxx@elasticsearch-example.com:80/]}}
[WARN ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Unsupported or unrecognized SSL message", :exception=>Manticore::UnknownException, :cause=>#<Java::JavaxNetSsl::SSLException: Unsupported or unrecognized SSL message>}
[WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://****:xxxxxx@elasticsearch-example.com:80/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elasticsearch-example.com:80/][Manticore::UnknownException] Unsupported or unrecognized SSL message"}
[INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"some-metrics-%{+YYYY.MM}"}
[INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ][logstash.filters.csv     ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["some-metrics/logstash.conf"], :thread=>"#<Thread:0x23757c95 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.52}
[INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[INFO ][filewatch.observingtail  ][main] START, creating Discoverer, Watch with file and sincedb collections
[INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[WARN ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Unsupported or unrecognized SSL message", :exception=>Manticore::UnknownException, :cause=>#<Java::JavaxNetSsl::SSLException: Unsupported or unrecognized SSL message>}
[WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://****:xxxxxx@elasticsearch-example.com:80/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elasticsearch-example.com:80/][Manticore::UnknownException] Unsupported or unrecognized SSL message"}

Unexpected character ('<' (code 60)): expected a valid value

You are connecting to port 443 and may be getting HTML in response. When connecting to ES you should be getting JSON back. Are you sure 443 is right?
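
For comparison, the root endpoint of Elasticsearch itself answers with JSON along these lines (field values are illustrative):

{
  "name" : "node-1",
  "cluster_name" : "my-cluster",
  "version" : { "number" : "8.11.1" },
  "tagline" : "You Know, for Search"
}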

I think he has misconfigured elasticsearch.conf: the HTTPS connection is forwarding to the Kibana host instead of ES. That is why LS got an unexpected response and dumped HTML instead of JSON.

Thanks @Rios and @Badger for the recommendations :smiley: :ok_hand:

I've changed the nginx configuration for the Elasticsearch server, namely /etc/nginx/conf.d/elasticsearch.conf, so that Elasticsearch is now served on a different HTTPS port, i.e. 8443. The result is that it works now!

Here's the new configuration:

upstream kibana {
    server localhost:5601 fail_timeout=0;
}

upstream elasticsearch {
    server localhost:9200 fail_timeout=0;
}

server {
    listen                  8443 ssl;
    server_name             elasticsearch.linuxserver.com;
    ssl_certificate         /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.crt;
    ssl_certificate_key     /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.key;
    ssl_password_file       /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.pw;
    location / {
        add_header          Strict-Transport-Security "max-age=31536000; includeSubdomains; preload" always;
        add_header          Access-Control-Allow-Origin $http_origin;
        proxy_set_header    Host $host:$server_port;
        proxy_set_header    X-Real-IP $remote_addr;
        proxy_set_header    X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header    X-Forwarded-Proto $scheme;
        proxy_redirect      http:// https://;
        proxy_pass          http://elasticsearch;
    }
}

server {
    listen                  443 ssl;
    server_name             elasticsearch.linuxserver.com;
    ssl_certificate         /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.crt;
    ssl_certificate_key     /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.key;
    ssl_password_file       /etc/pki/tls/apps/elasticsearch/elasticsearch.linuxserver.com.pw;

    location / {
        add_header          Strict-Transport-Security "max-age=31536000; includeSubdomains; preload" always;
        add_header          Access-Control-Allow-Origin $http_origin;
        proxy_set_header    Host $host:$server_port;
        proxy_set_header    X-Real-IP $remote_addr;
        proxy_set_header    X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header    X-Forwarded-Proto $scheme;
        proxy_redirect      http:// https://;
        proxy_pass          http://kibana;
    }
}

Do you have any recommendations about the new, working nginx configuration file, namely whether some parts could be written once instead of twice (clean code)? If not, that's fine :slightly_smiling_face:
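
For what it's worth, one way to factor out the repeated settings is nginx's include directive; a sketch, with an illustrative snippet name (it must not end in .conf, or the top-level include /etc/nginx/conf.d/*.conf would apply it globally):

# /etc/nginx/conf.d/proxy-common.inc
add_header          Strict-Transport-Security "max-age=31536000; includeSubdomains; preload" always;
add_header          Access-Control-Allow-Origin $http_origin;
proxy_set_header    Host $host:$server_port;
proxy_set_header    X-Real-IP $remote_addr;
proxy_set_header    X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header    X-Forwarded-Proto $scheme;
proxy_redirect      http:// https://;

Each location block then reduces to:

    location / {
        include             /etc/nginx/conf.d/proxy-common.inc;
        proxy_pass          http://elasticsearch;   # or http://kibana
    }

The ssl_certificate, ssl_certificate_key, and ssl_password_file lines could likewise move up into the http context, since both server blocks use the same certificate.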

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.