Marvel issue

I had Marvel working fine, but now when I navigate to Marvel in Kibana I see two greyed-out clusters (there should only be one), and I see this error repeat over and over in the logs:

[2016-02-17 17:34:56,587][DEBUG][action.search.type ] [flash] [.marvel-es-2016.02.04][0], node[gLs5UVUpR2mMS5CloZZeGg], [R], v[17], s[STARTED], a[id=JT3LhEQGSwGGKTpGLMWPGQ]: Failed to execute [org.elasticsearch.action.search.SearchRequest@7c3af717] lastShard [true]
RemoteTransportException[[flash][172.16.0.100:9300][indices:data/read/search[phase/query]]]; nested: SearchParseException[failed to parse search source [{"size":0,"query":{"filtered":{"filter":{"term":{}}}},"aggs":{"indices":{"meta":{"cluster_uuid":"pqNwv3mgRMSR3NBGy4rZlA"},"terms":{"field":"shard.index","size":10000},"aggs":{"states":{"terms":{"field":"shard.state","size":10},"aggs":{"primary":{"terms":{"field":"shard.primary","size":10}}}}}},"nodes":{"meta":{"cluster_uuid":"pqNwv3mgRMSR3NBGy4rZlA"},"terms":{"field":"shard.node","size":10000},"aggs":{"index_count":{"cardinality":{"field":"shard.index"}}}}}}]]; nested: QueryParsingException[No value specified for term query];
Caused by: SearchParseException[failed to parse search source [{"size":0,"query":{"filtered":{"filter":{"term":{}}}},"aggs":{"indices":{"meta":{"cluster_uuid":"pqNwv3mgRMSR3NBGy4rZlA"},"terms":{"field":"shard.index","size":10000},"aggs":{"states":{"terms":{"field":"shard.state","size":10},"aggs":{"primary":{"terms":{"field":"shard.primary","size":10}}}}}},"nodes":{"meta":{"cluster_uuid":"pqNwv3mgRMSR3NBGy4rZlA"},"terms":{"field":"shard.node","size":10000},"aggs":{"index_count":{"cardinality":{"field":"shard.index"}}}}}}]]; nested: QueryParsingException[No value specified for term query];
at org.elasticsearch.search.SearchService.parseSource(SearchService.java:853)
at org.elasticsearch.search.SearchService.createContext(SearchService.java:652)
...
Caused by: [.marvel-es-2016.02.04] QueryParsingException[No value specified for term query]
...

It repeats for each of the Marvel indices from 2/4 to 2/17, plus .marvel-es-data.
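For what it's worth, the parse error seems to come down to the filter in the request being an empty term query: the log shows `"filter":{"term":{}}` where a field and value should be. A rough sketch of the difference (the cluster_uuid value is copied from the log; the little validity check is purely illustrative, not Elasticsearch code):

```python
# The filter Marvel is sending, taken from the log above: an empty term query.
# Elasticsearch rejects {"term": {}} with "No value specified for term query".
failing_filter = {"term": {}}

# What a well-formed term filter looks like; normally Marvel would fill in
# the cluster_uuid itself (value below copied from the log's aggs metadata):
valid_filter = {"term": {"cluster_uuid": "pqNwv3mgRMSR3NBGy4rZlA"}}

def is_valid_term_query(clause):
    """A term clause must name exactly one field with a value."""
    return len(clause.get("term", {})) == 1

print(is_valid_term_query(failing_filter))  # False
print(is_valid_term_query(valid_filter))    # True
```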

If I do GET _cat/indices?v in Sense, I see that all of the indices, including the .marvel ones, are green.

I did recently update to the latest versions of everything and install a new, free Marvel license.

I suspect I have this issue: "Marvel 2.1 basic licence. Cannot access cluster data"

Why do I show two clusters?

No one? I'm a bit surprised that I haven't gotten any response from anyone at Elastic.

Hi Jerry,

The last two weeks have been busy for Elastic. We had Elastic{ON} two weeks ago, then a busy week following that.

Excuses aside, I honestly have not come across this issue before. I did notice in your screenshot that the "two" clusters have different license expiration dates.

I did recently update to the latest versions of everything and install a new, free marvel license.

  • Which version of ES are you running?
  • What version did you upgrade from?
  • Is the monitoring cluster separate or contained in the same cluster?
  • To be explicit: you are not running two separate clusters, right?

Let us know.

Fair enough :slightly_smiling: Hope Elastic{ON} went well! Wish I could have been there...

Running the latest version of ES: 2.2.0
The upgrade was from 2.1.1 to 2.2.0
The cluster is only 2 machines, and monitoring is on the same cluster
And no, I'm definitely not running two clusters.

I did recently change the IP address (and updated it in several places in elasticsearch.yml) due to a strange Azure issue: there was some sort of networking problem with the VM, and changing its IP address "fixed" it. My recollection is that this happened before the upgrade, but I'm not positive.

Hi Jerry,

The problem most likely stems from the combination of the IP change and the license upgrade, which seems to be confusing Marvel. Without having access to your cluster (not asking for it :)), and given how out of date the Marvel data is, I think the easiest course of action is to just remove the defunct data:

curl -XDELETE monitor-host:9200/.marvel-*

I normally would not suggest deleting the data, but it's going to be the easiest way to start fresh.
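If you'd like to double-check before pulling the trigger, something like this should show exactly what the wildcard will match (same placeholder host as above):

```shell
# List the Marvel indices first so you can see what the wildcard matches:
curl -XGET 'monitor-host:9200/_cat/indices/.marvel-*?v'

# Then delete them:
curl -XDELETE 'monitor-host:9200/.marvel-*'
```

Marvel will recreate its indices as new data comes in, so this only costs you the historical monitoring data.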

Hope that helps.


That did the trick - thank you!