# Couldn't find any Elasticsearch data
Hi all,
I have installed the ELK stack as Docker containers, but I am not able to see any Elasticsearch data in Kibana.
When I open http://95.217.144.48:9200/ in the browser, it shows me:
{
  "name" : "92bbcfa738c4",
  "cluster_name" : "docker-cluster",
  "cluster_uuid" : "V-co8WltSd-1scKQKPMhBA",
  "version" : {
    "number" : "7.0.1",
    "build_flavor" : "oss",
    "build_type" : "docker",
    "build_hash" : "e4efcb5",
    "build_date" : "2019-04-29T12:56:03.145736Z",
    "build_snapshot" : false,
    "lucene_version" : "8.0.0",
    "minimum_wire_compatibility_version" : "6.7.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
Here is the curl command I tried in a terminal; it returns 401 Unauthorized:
curl -XGET 'http://95.217.144.48:9200/_cat/indices?v&pretty'
401 Unauthorized
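Since the cluster answers with 401 Unauthorized, I assume I need to pass the same credentials that Logstash uses. This is the authenticated form I intend to try (the user and password below are just the redacted placeholders from my logstash.conf):

```sh
# Same request as above, but with HTTP basic auth; replace with the real user/password
curl -u 'elastic:xxxxx' -XGET 'http://95.217.144.48:9200/_cat/indices?v&pretty'
```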
Here is the logstash.conf:
input {
  tcp {
    port => 5044
    codec => "json"
  }
}
## Add your filters / logstash plugins configuration here
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "xxxxx"
    password => "xxxxx"
  }
  # stdout { codec => rubydebug }
}
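To check whether events actually flow through this pipeline, this is how I plan to push a test event into the TCP input (assuming netcat is available on the host; the message content is an arbitrary test value):

```sh
# Send one JSON line to the tcp input on port 5044; the json codec expects one JSON object per line
echo '{"message":"test event from netcat"}' | nc -w 1 95.217.144.48 5044
```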
Here are the container logs for Logstash:
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2021-07-12T07:08:48,914][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2021-07-12T07:08:54,084][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
[2021-07-12T07:08:54,450][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
[2021-07-12T07:08:54,603][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2021-07-12T07:08:54,613][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-07-12T07:08:54,729][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
[2021-07-12T07:08:54,766][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0xda578d3 run>"}
[2021-07-12T07:08:54,780][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2021-07-12T07:08:55,044][INFO ][logstash.inputs.tcp ] Automatically switching from json to json_lines codec {:plugin=>"tcp"}
[2021-07-12T07:08:55,135][INFO ][logstash.outputs.elasticsearch] Index Lifecycle Management is set to 'auto', but will be disabled - Index Lifecycle management is not installed on your Elasticsearch cluster
[2021-07-12T07:08:55,142][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2021-07-12T07:08:55,500][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2021-07-12T07:08:55,518][INFO ][logstash.inputs.tcp ] Starting tcp input listener {:address=>"0.0.0.0:5044", :ssl_enable=>"false"}
[2021-07-12T07:08:55,686][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-07-12T07:08:56,139][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
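The Logstash API endpoint on port 9600 is up (and that port is published in my compose file), so I assume I can use the node stats API to see whether the pipeline has received or emitted any events:

```sh
# Ask the Logstash monitoring API for per-pipeline event counters (events in / filtered / out)
curl -XGET 'http://95.217.144.48:9600/_node/stats/pipelines?pretty'
```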
Here is the logstash service from my docker-compose.yml (Elasticsearch, Kibana, and nginx are also defined in the same file):
logstash:
  # image: docker.elastic.co/logstash/logstash-oss:6.6.1
  image: docker.elastic.co/logstash/logstash-oss:7.0.1
  # image: docker.elastic.co/logstash/logstash:6.6.1
  # image: docker.elastic.co/logstash/logstash:6.3.1
  ports:
    # - "5000:5000"
    - "5044:5044"
    - "9600:9600"
  configs:
    - source: logstash_config
      target: /usr/share/logstash/config/logstash.yml:rw
    - source: logstash_pipeline
      target: /usr/share/logstash/pipeline/logstash.conf
  volumes:
    - logstash:/usr/share/logstash/data
    # - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:rw
    # - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
  environment:
    LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    # xpack.monitoring.elasticsearch.url: "elasticsearch:9200"
    # xpack.monitoring.elasticsearch.username: "xxxx"
    # xpack.monitoring.elasticsearch.password: "xxxxx"
  # networks:
  #   - host
  deploy:
    mode: replicated
    replicas: 1
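To rule out a networking problem between the containers, I assume I can exec into the Logstash container and call Elasticsearch by its service name (the container name below is a placeholder, and curl is assumed to be present in the image):

```sh
# From inside the Logstash container, resolve and query the "elasticsearch" service directly
docker exec -it <logstash-container-name> curl -u 'elastic:xxxxx' 'http://elasticsearch:9200/_cat/indices?v'
```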
Here is the logstash.yml:
http.host: 0.0.0.0
path.config: /usr/share/logstash/pipeline
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.password: changeme
xpack.monitoring.elasticsearch.url: http://elasticsearch:9200
xpack.monitoring.elasticsearch.username: elastic
I'm new to ELK. I have deployed five services/containers — Kibana, Elasticsearch, Logstash, nginx, and Caddy — and all of them are running. But when I open Kibana in the browser, it says "Couldn't find any Elasticsearch data". Please tell me how to get the Elasticsearch indices to show up in Kibana, and why the Logstash data is not showing in the Kibana UI.
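From what I understand, Kibana only shows data once an index pattern exists that matches actual indices (for Logstash the default index name is logstash-*). If the indices exist, I assume the pattern can be created either under Management > Index Patterns or via the Kibana saved objects API, roughly like this (the host, port 5601, credentials, and the @timestamp time field are assumptions based on my setup):

```sh
# Create a "logstash-*" index pattern in Kibana 7.x via the saved objects API
curl -u 'elastic:xxxxx' -XPOST 'http://95.217.144.48:5601/api/saved_objects/index-pattern' \
  -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
  -d '{"attributes": {"title": "logstash-*", "timeFieldName": "@timestamp"}}'
```

Is this the right approach, or am I missing something in my configuration?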