Hello Elastic Forum,
I know this question has been asked many times before; however, I am fairly sure that Logstash is successfully creating an index, yet I am still getting the error. Hopefully someone can see the error of my ways.
I have the Elastic Stack (Elasticsearch 5.2.0, Logstash 5.2.0, Kibana 5.2.0) running on a single Windows Server 2016 VM, and for my initial test I am shipping log4net files into the Logstash pipeline using Filebeat on the remote application server.
After the initial configuration I can see the default indexes by calling http://localhost:9200/_all:
PS C:\Windows\system32> $all = (Invoke-WebRequest -uri http://localhost:9200/_all -Credential $creds).content | ConvertFrom-Json
PS C:\Windows\system32> $all | fl
.monitoring-es-2-2017.02.03 : @{aliases=; mappings=; settings=}
.kibana : @{aliases=; mappings=; settings=}
.security : @{aliases=; mappings=; settings=}
.monitoring-data-2 : @{aliases=; mappings=; settings=}
I then created my logstash.conf file, like so:
input {
  beats {
    port => "5043"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    user => "elastic"
    password => "XXXXXX"
    # no index set, so the default "logstash-%{+YYYY.MM.dd}" naming applies
  }
}
I figured that by leaving the filter block out I would rule out any filter syntax errors; I will configure the filters later.
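For what it is worth, I understand the pipeline config can be syntax-checked before starting Logstash with something like this (the paths below are placeholders for my actual install):
# placeholder paths - validate the pipeline config and exit without running it
& 'C:\logstash-5.2.0\bin\logstash.bat' -f 'C:\logstash-5.2.0\config\logstash.conf' --config.test_and_exit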
After starting Filebeat I can then see that the Logstash index has been created:
PS C:\Windows\system32> $all = (Invoke-WebRequest -uri http://localhost:9200/_all -Credential $creds).content | ConvertFrom-Json
PS C:\Windows\system32> $all | fl
.monitoring-es-2-2017.02.03 : @{aliases=; mappings=; settings=}
.kibana : @{aliases=; mappings=; settings=}
.security : @{aliases=; mappings=; settings=}
logstash-2017.02.03 : @{aliases=; mappings=; settings=}
.monitoring-data-2 : @{aliases=; mappings=; settings=}
However, I am still getting the "unable to fetch mapping" error in Kibana.
I have checked the stats of the index and I can see that there are 18 docs, which matches the number of lines in the test file I am using.
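For reference, the $logS used in the next snippet was populated from the index stats API, roughly like this (my reconstruction, reusing the $creds from above):
# reconstruction - $logS below holds the _stats output for the new index
$logS = (Invoke-WebRequest -uri http://localhost:9200/logstash-2017.02.03/_stats -Credential $creds).content | ConvertFrom-Json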
PS C:\Windows\system32> $logS.indices.'logstash-2017.02.03'.primaries
docs : @{count=18; deleted=0}
store : @{size_in_bytes=59569; throttle_time_in_millis=0}
indexing : @{index_total=18; index_time_in_millis=197; index_current=0; index_failed=0; delete_total=0; delete_time_in_millis=0;
delete_current=0; noop_update_total=0; is_throttled=False; throttle_time_in_millis=0}
get : @{total=0; time_in_millis=0; exists_total=0; exists_time_in_millis=0; missing_total=0; missing_time_in_millis=0; current=0}
search : @{open_contexts=0; query_total=0; query_time_in_millis=0; query_current=0; fetch_total=0; fetch_time_in_millis=0;
fetch_current=0; scroll_total=0; scroll_time_in_millis=0; scroll_current=0; suggest_total=0; suggest_time_in_millis=0;
suggest_current=0}
merges : @{current=0; current_docs=0; current_size_in_bytes=0; total=0; total_time_in_millis=0; total_docs=0; total_size_in_bytes=0;
total_stopped_time_in_millis=0; total_throttled_time_in_millis=0; total_auto_throttle_in_bytes=104857600}
refresh : @{total=10; total_time_in_millis=262; listeners=0}
flush : @{total=5; total_time_in_millis=161}
warmer : @{current=0; total=20; total_time_in_millis=8}
query_cache : @{memory_size_in_bytes=0; total_count=0; hit_count=0; miss_count=0; cache_size=0; cache_count=0; evictions=0}
fielddata : @{memory_size_in_bytes=0; evictions=0}
completion : @{size_in_bytes=0}
segments : @{count=6; memory_in_bytes=34824; terms_memory_in_bytes=32196; stored_fields_memory_in_bytes=1872;
term_vectors_memory_in_bytes=0; norms_memory_in_bytes=0; points_memory_in_bytes=12; doc_values_memory_in_bytes=744;
index_writer_memory_in_bytes=0; version_map_memory_in_bytes=0; fixed_bit_set_memory_in_bytes=0;
max_unsafe_auto_id_timestamp=1486098404861; file_sizes=}
translog : @{operations=0; size_in_bytes=215}
request_cache : @{memory_size_in_bytes=0; evictions=0; hit_count=0; miss_count=0}
recovery : @{current_as_source=0; current_as_target=0; throttle_time_in_millis=0}
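To double-check the documents themselves, I assume I can pull one back directly, something along these lines (a sketch, reusing $creds; I have not pasted the output here):
# sketch - fetch a single document from the new index to inspect its fields
$search = (Invoke-WebRequest -uri "http://localhost:9200/logstash-2017.02.03/_search?size=1" -Credential $creds).content | ConvertFrom-Json
$search.hits.hits[0]._source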
I can also see the index directory on the file system, named with the same uuid as listed in the index settings below:
PS C:\Windows\system32> $logS = (Invoke-WebRequest -uri http://localhost:9200/logstash-2017.02.03 -Credential $creds).content | ConvertFrom-Json
PS C:\Windows\system32> $logS.'logstash-2017.02.03'.settings.index
refresh_interval : 5s
number_of_shards : 5
provided_name : logstash-2017.02.03
creation_date : 1486098404736
number_of_replicas : 1
uuid : gFrYacYdSludHo6zDvVIIw
version : @{created=5020099}
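That file-system check is basically just listing the indices directory under the Elasticsearch data path and matching the folder name to that uuid (the data path below is a placeholder for mine; in 5.x the index directories are named after the uuid):
# placeholder data path - each index directory under nodes\0\indices is named by uuid
Get-ChildItem 'C:\ProgramData\Elastic\Elasticsearch\data\nodes\0\indices' | Select-Object Name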
I have also had a look in the Kibana log and I can see a 404 error when it tries to get the mappings:
"method":"get","statusCode":404,"req":{"url":"/elasticsearch/logstash-*/_mapping/field/*?_=1486100447579&ignore_unavailable=false&allow_no_indices=false&include_defaults=true"
Hopefully there is enough info here for someone to spot what is wrong, but if not, please let me know.
Thanks,
Tim