Discover tab showing no results

Hi All,

The Discover tab shows "no results found". For the index pattern I selected @timestamp as the time field, and I am able to see the indexes.
My Kibana and Elasticsearch are both at version 5.6.1.

Can you please let me know what's missing? I have two index patterns, one for logstash and the other for fsimage. fsimage is the index I am having issues with.

This is my filter configuration:

filter {
  if [type] == "fsimagedaily" {

    csv {
      separator => "|"
      columns => [ "HDFSPath", "replication", "ModificationTime", "AccessTime", "PreferredBlockSize", "BlocksCount", "FileSize", "NSQUOTA", "DSQUOTA", "permission", "user", "group" ]
      convert => {
        'replication' => 'integer'
        'PreferredBlockSize' => 'integer'
        'BlocksCount' => 'integer'
        'FileSize' => 'integer'
        'NSQUOTA' => 'integer'
        'DSQUOTA' => 'integer'
      }
    }

    date {
      match => ['ModificationTime', 'YYYY-MM-ddHH:mm']
      target => "modifyTime"
      remove_field => ['ModificationTime']
    }

    # date {
    #   match => ['AccessTime', 'YYYY-MM-ddHH:mm']
    # }

    date {
      match => ['AccessTime', 'YYYY-MM-ddHH:mm']
      target => "accessTime"
      remove_field => ['AccessTime']
    }
  }
}

Looking at your pipeline, are you sure the time field shouldn't be modifyTime or accessTime? You selected @timestamp, but none of your date filters target @timestamp.
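
If you do want @timestamp populated from the file's own timestamps, one option is to let a date filter write to its default target, since the date filter writes to @timestamp when no target is specified. A minimal sketch, reusing your match pattern:

date {
  # no target set, so the parsed date is written to @timestamp
  match => ['ModificationTime', 'YYYY-MM-ddHH:mm']
}

Otherwise, recreate the index pattern with modifyTime or accessTime as its time field.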

Are you able to see fsimage data if you search for it in the Dev Tools Console?

POST /fsimage/_search
{
  "query": {
    "match_all": {}
  },
  "sort": [
    {
      "@timestamp": { // [1]
        "order": "desc"
      }
    }
  ]
}

[1] This should be the time field you used when creating the index pattern. If you aren't seeing what you expect with this query, try replacing it with modifyTime or accessTime.

Hi,

Please let me know where I am going wrong.

I did try with @timestamp, accessTime, and modifyTime; I get the following error for all three:

 Deprecation: Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.

{
  "error": {
    "root_cause": [
      {
        "type": "index_not_found_exception",
        "reason": "no such index",
        "resource.type": "index_or_alias",
        "resource.id": "fsimage",
        "index_uuid": "_na_",
        "index": "fsimage"
      }
    ],
    "type": "index_not_found_exception",
    "reason": "no such index",
    "resource.type": "index_or_alias",
    "resource.id": "fsimage",
    "index_uuid": "_na_",
    "index": "fsimage"
  },
  "status": 404
}
---------------------------------------------------------------------------------------------------------------
When I execute this:

GET _search
{
  "query": {
    "match_all": {}
  }
}

I get the following output for fsimage:

{
  "_index": ".kibana",
  "_type": "search",
  "_id": "4e265220-0f3f-11e7-be1a-897e9d3c543f",
  "_score": 1,
  "_source": {
    "title": "fsimage",
    "description": "",
    "hits": 0,
    "columns": ["_source"],
    "sort": ["@timestamp", "desc"],
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": """{"index":"logstash-*","query":{"query_string":{"analyze_wildcard":true,"query":"*"}},"filter":[{"meta":{"negate":false,"index":"logstash-*","key":"type","value":"fsimage","disabled":false,"alias":null},"query":{"match":{"type":{"query":"fsimage","type":"phrase"}}},"$state":{"store":"appState"}}],"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"require_field_match":false,"fragment_size":2147483647}}"""
    }
  }
},
{
  "_index": ".kibana",
  "_type": "search",
  "_id": "c7b94ae0-1320-11e7-8397-e7424a7c9acf",
  "_score": 1,
  "_source": {
    "title": "access time",
    "description": "",
    "hits": 0,
    "columns": ["_source"],
    "sort": ["@timestamp", "desc"],
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": """{"index":"logstash-2017*","query":{"query_string":{"analyze_wildcard":true,"query":"accessTime:[2017-03-01 TO 2017-03-25]"}},"filter":[],"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"require_field_match":false,"fragment_size":2147483647}}"""
    }
  }
},
{
  "_index": ".kibana",
  "_type": "search",
  "_id": "abbf3400-1a2b-11e7-a800-a39adf5cfee2",
  "_score": 1,
  "_source": {
    "title": "hive databases",
    "description": "",
    "hits": 0,
    "columns": ["_source"],
    "sort": ["@timestamp", "desc"],
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": """{"index":"test","query":{"query_string":{"query":"HDFSDEPTH1: apps AND HDFSDEPTH2: hive","analyze_wildcard":true}},"filter":[],"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"require_field_match":false,"fragment_size":2147483647}}"""
    }
  }
},
{
  "_index": ".kibana",
  "_type": "search",
  "_id": "2301a3e0-1969-11e7-8c16-7f4f956c0889",
  "_score": 1,
  "_source": {
    "title": "fsimage",
    "description": "",
    "hits": 0,
    "columns": ["_source"],
    "sort": ["@timestamp", "desc"],
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": """{"index":"logstash-*","query":{"query_string":{"analyze_wildcard":true,"query":"type: fsimage"}},"filter":[],"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"require_field_match":false,"fragment_size":2147483647}}"""
    }
  }
},
{
  "_index": ".kibana",
  "_type": "search",
  "_id": "2a8c4640-6b19-11e7-877e-ed8345e7592b",
  "_score": 1,
  "_source": {
    "title": "test",
    "description": "",
    "hits": 0,
    "columns": ["HDFSPath", "FileSize"],
    "sort": ["@timestamp", "desc"],
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": """{"index":"fsimage*","query":{"query_string":{"analyze_wildcard":true,"query":"*"}},"filter":[],"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}},"require_field_match":false,"fragment_size":2147483647}}"""
    }
  }
}
]
}
}

The // [1] part of the JSON query wasn't supposed to be part of the query. Sorry for the confusion.

Based on the query for /fsimage/_search returning a 404, it looks like you don't have that data in your Elasticsearch cluster. Try going to http://localhost:9200/_cat/indices to see what indices you actually have, or try out X-Pack Monitoring to see a list of your indices.
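
For example, from a shell (the ?v flag adds column headers):

curl 'http://localhost:9200/_cat/indices?v'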

If "fsimage" data isn't coming in to your cluster, then there probably is an issue with your Logstash pipeline. Looks like you gave the filter section for the fsimage pipeline. What does the output section look like?

Just to be clear, that search looks at all the indices in your cluster, and the results that you have pasted are only for the .kibana system index. There's no "output for fsimage" here.

Hi,
Thanks a lot for looking into the issue.

In Logstash, this is my output.conf:

output {
  if [type] == "fsimagedaily" {
    elasticsearch {
      hosts => ["sandbox.com:9200"]
      index => "fsimage-%{+YYYY.MM.dd}"
    }
  }
  else {
    elasticsearch {
      hosts => ["sandbox.com:9200"]
    }
  }
}

When I execute _cat/indices I get:
yellow open logstash-2017.08.25 cyuov7VgSz6TUXBcV617PA 6 1 42315 0 29.4mb 29.4mb
yellow open fsimage-2017.05.11 lVKGs0_ySMWyi_pJ7tKkzw 5 1 2169786 0 1.9gb 1.9gb
yellow open fsimage-2017.07.18 21E5y9ajQXCWYmWYWOgzkg 5 1 113860 0 108.4mb 108.4mb
yellow open logstash-2017.08.01 vKpvnzOPRSi6poLu1vlY1g 6 1 123979 0 77.2mb 77.2mb
yellow open fsimage-2017.05.08 P-B3EqqdR0eKEQopdR_vvA 5 1 2086485 0 1.8gb 1.8gb
yellow open fsimage-2017.07.11 LXbAgeUFSYuOv49Ne6RP1w 5 1 83248 0 74.9mb 74.9mb
yellow open fsimage-2017.05.12 LCeKEUR2RGGxnd-NL97ZxA 5 1 2208981 0 1.9gb 1.9gb

The fact that you have fsimage-* indices in the cluster is a good sign. It looks like the 404 you received was because I asked you to query /fsimage when it should have been /fsimage-*:

POST /fsimage-*/_search
{
  "query": {
    "match_all": {}
  },
  "sort": [
    {
      "the_time_field": {
        "order": "desc"
      }
    }
  ]
}

In Kibana, the "fsimage" index pattern should be based on fsimage-*, and the time field you select should be a valid date field in your data. Try the query above, replacing the_time_field with a field that makes sense for your data.
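
You can also confirm that the field you pick is actually mapped as a date. A quick check against one of your daily indices (index name taken from your _cat output):

GET /fsimage-2017.05.11/_mapping

In the response, modifyTime and accessTime should show "type": "date" under properties; if the date filters ever failed to parse, those fields may have ended up mapped as strings instead.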

Hope that helps.

Hi
I am not sure what the issue is. I have data up to Sep 5th; after that it's not working.

I see this error in logstash logs:
[2017-09-22T10:32:19,855][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, input, filter, output at line 482, column 1 (byte 14212) after "}
[2017-09-22T10:32:26,217][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-09-22T10:32:26,219][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-09-22T10:32:26,512][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, input, filter, output at line 482, column 1 (byte 14212) after "}

And when I start Logstash via ./logstash -f fsimagedaily.conf I get the message below. The script executes, but data is not being populated:

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
17/09/22 10:10:07 INFO namenode.TransferFsImage: Opening connection to http://sandbox.com:50070/imagetransfer?getimage=1&txid=latest

I think this is more of a Logstash question, but it seems your Logstash configuration has a syntax problem and Logstash can't use it.

There's an option to have Logstash check the syntax of your config file for you, documented under "Running Logstash from the Command Line" in the Logstash Reference.

An example of how to run this check is:

logstash-5.6.1/bin/logstash -t -f my-pipeline.conf
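
If the file parses cleanly, the command should finish with something like "Configuration OK"; otherwise it prints the same "Expected one of #, input, filter, output" error together with the line and column of the problem, which should point you at the broken spot around line 482 of your config.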

Hi,

Thanks a lot, it's working now... but there are data discrepancies. The data on the actual cluster doesn't match the data in Kibana. I am extracting the fsimage from the Hadoop cluster and filtering the data via the csv filter, and when it is projected in Kibana there are discrepancies.
If folder A is 500GB in the cluster, it is 380GB in Kibana.

How can this be rectified?

I'm not sure where the csv filter comes into this, because I don't know Hadoop very well, but have you looked into simply using ES-Hadoop (Elasticsearch for Hadoop)?

How are you determining the numbers in Elasticsearch? Are you still just looking at the Discover tab to see the same numbers in Kibana?

Compare the query that the Discover app is using to fetch the data against the method you are currently using to view the data in Elasticsearch.

You can see the query that Discover is using by looking at the Request tab of the Spy panel: https://www.elastic.co/guide/en/kibana/master/vis-spy.html
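
If you want to compare totals directly, you could also run a quick aggregation in the Dev Tools Console and check it against what HDFS reports (e.g. via hdfs dfs -du). A sketch, using a hypothetical folder /A; FileSize comes from your CSV columns and is assumed to hold bytes, and with the default 5.x dynamic mapping the string field would be queried via HDFSPath.keyword (adjust to your actual mapping):

POST /fsimage-*/_search
{
  "size": 0,
  "query": {
    "prefix": { "HDFSPath.keyword": "/A" }
  },
  "aggs": {
    "total_bytes": {
      "sum": { "field": "FileSize" }
    }
  }
}

If total_bytes comes up short of what the cluster reports, the gap is probably events that never made it into Elasticsearch, so it's worth checking the Logstash logs for _csvparsefailure or _dateparsefailure tags.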

Thanks for the suggestion.

I need to project the data used by different directories/applications in the Hadoop cluster over a time period.
The fsimage contains the details about HDFS, so I am extracting the fsimage with the offline image viewer tool and projecting its contents in Kibana via the csv filter.

Regarding ES-Hadoop, I am not sure how I would achieve that. Any documentation would be helpful.

Regarding ES-Hadoop, you can visit: https://www.elastic.co/guide/en/elasticsearch/hadoop/current/index.html

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.