Hello,
I use fluentd, Elasticsearch 1.2.4, and Kibana 3.1.2 on an Ubuntu Trusty platform to analyze log files.
I have an initial logfile that has to be taken into account. While trying to get the stack working, I found that when I start with an empty collection and put the logfile in the fluentd directory, Kibana shows only some, or none, of the records.
This is due to error messages in the ES log, which look like this:
[2016-02-10 09:38:25,344][DEBUG][action.search.type ] [Bling] [logstash-2016.02.08][1], node[cEvgxKytQbGKgsZxGZZXJQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@3ca82a1c] lastShard [true]
org.elasticsearch.search.SearchParseException: [logstash-2016.02.08][1]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"facets":{"0":{"date_histogram":{"field":"@timestamp","interval":"1h"},"global":true,"facet_filter":{"fquery":{"query":{"filtered":{"query":{"query_string":{"query":"*"}},"filter":{"bool":{"must":[{"range":{"@timestamp":{"from":1454488705089,"to":1455093505089}}}]}}}}}}}},"size":0}]]
at org.elasticsearch.search.SearchService.parseSource(SearchService.java:649)
at org.elasticsearch.search.SearchService.createContext(SearchService.java:511)
at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:483)
at org.elasticsearch.search.SearchService.executeQueryPhase(SearchService.java:252)
at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:206)
at org.elasticsearch.search.action.SearchServiceTransportAction$5.call(SearchServiceTransportAction.java:203)
at org.elasticsearch.search.action.SearchServiceTransportAction$23.run(SearchServiceTransportAction.java:517)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.search.facet.FacetPhaseExecutionException: Facet [0]: (key) field [@timestamp] not found
at org.elasticsearch.search.facet.datehistogram.DateHistogramFacetParser.parse(DateHistogramFacetParser.java:160)
at org.elasticsearch.search.facet.FacetParseElement.parse(FacetParseElement.java:93)
at org.elasticsearch.search.SearchService.parseSource(SearchService.java:633)
... 9 more
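The key line is the "field [@timestamp] not found" at the bottom: the date-histogram facet that Kibana sends references @timestamp, and at least one index in the queried date range apparently has no mapping for that field (for example an index that was created but never received a correctly parsed document). A way to check, assuming ES is reachable on localhost:9200 (host and index names are examples, adjust as needed):

```shell
# List the logstash indices in the cluster with their document counts
curl 'localhost:9200/_cat/indices/logstash-*?v'

# Inspect the mapping of a suspect index; look for an "@timestamp" entry
curl 'localhost:9200/logstash-2016.02.08/_mapping?pretty'
```

If one of the indices Kibana queries shows zero documents or a mapping without @timestamp, that index is what makes the facet blow up.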
At first I thought my data was bad, but all records are fine. The situation now is that the complete JSON logfile produces the logging above, yet when I cut the file into two pieces and feed those to fluentd, neither Kibana nor Elasticsearch has any problem.
This COULD be a fluentd issue, or it could be an ES issue with processing large numbers of records, but I have not the slightest idea how to track it down. I would be very interested in finding the data that causes ES to produce the stack trace, but I have no clue how to isolate it.
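Since splitting the file in two already changes the behaviour, one way to isolate the offending data without guessing is a binary search over the file: split it in half, feed each half to fluentd separately, keep the half that triggers the stack trace, and repeat until few records remain. A minimal sketch of one split step (the file name and stand-in data are hypothetical; use the real JSON logfile):

```shell
# Stand-in for the real logfile; replace with the actual JSON log
f=sample.log
printf '%s\n' '{"a":1}' '{"a":2}' '{"a":3}' '{"a":4}' > "$f"

# Split the file into two halves by line count
total=$(wc -l < "$f")
half=$(( (total + 1) / 2 ))
head -n "$half"          "$f" > part1.log
tail -n +"$(( half + 1 ))" "$f" > part2.log

# Feed part1.log and part2.log to fluentd one at a time;
# recurse into whichever half reproduces the error
wc -l part1.log part2.log
```

Repeating this halving a dozen times narrows even a large file down to a single record.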
Has anyone seen a similar situation, or have advice on what to do next?
Ruud