RhysEvans
(Rhys Evans)
November 8, 2019, 9:31pm
1
Hi
I am having an issue where two fields, source.port and source.ip, keep changing format in Kibana. These are coming in via Logstash.
This is what I am expecting:
JSON output (subsection):
},
"source": {
    "port": "34705",
    "ip": "127.0.0.2"
},
"network": {
    "protocol": "dns",
This is wrong:
JSON output (subsection):
},
"source": {
    "ip": "127.0.0.2",
    "port": "34705"
},
"data": {
    "srcip": "127.0.0.2",
When looking at the Kibana index pattern during the issue, I see the following (see the fields count in the screenshot).
I then refresh the Kibana index pattern and get the following (see the fields count).
This then returns the results as expected again.
This then happens again a bit later, and I have to go through the same process.
Please note the index in ES is the same index, and as such its mappings haven't changed.
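One way to double-check that the ES mapping really is stable is to flatten the response of `GET /<index>/_mapping` into field → type pairs and diff snapshots taken before and during the issue. A minimal sketch (the sample mapping below is illustrative, not the actual template):

```python
# Flatten an Elasticsearch mapping "properties" tree into "field -> type"
# pairs so two snapshots can be compared with a simple dict equality check.
def flatten_mapping(properties, prefix=""):
    fields = {}
    for name, spec in properties.items():
        path = f"{prefix}{name}"
        if "properties" in spec:
            # Nested object: recurse with a dotted prefix (e.g. "source.ip").
            fields.update(flatten_mapping(spec["properties"], path + "."))
        else:
            fields[path] = spec.get("type", "object")
    return fields

# Illustrative sample, shaped like the "properties" section of a mapping.
sample = {"source": {"properties": {"ip": {"type": "ip"},
                                    "port": {"type": "long"}}}}
print(flatten_mapping(sample))  # {'source.ip': 'ip', 'source.port': 'long'}
```

If two snapshots produce the same flattened dict, the ES mapping itself has not changed and the problem lies on the Kibana side.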
I am running ES and Kibana 7.3.2
Any ideas?
Any help is appreciated
Thanks
RhysEvans
(Rhys Evans)
November 9, 2019, 8:15pm
2
So when this isn't working, the source "field" shows as a string. A Kibana refresh corrects this, and the source field is then broken down into the Elastic template mappings (ECS based). It will revert to a string again later.
RhysEvans
(Rhys Evans)
November 11, 2019, 10:49am
3
Bump, does anyone have any ideas? Is any other info required to help resolve this?
Note: I have upgraded one of the instances showing this issue to 7.4.2 and am still getting the same problem.
Any help is appreciated
Thanks
mattkime
(Matthew Kime)
November 11, 2019, 6:20pm
4
There are two paths to follow to figure out what is going on. First: is the field mapping changing? Are you using an index template? Are you using index lifecycle management? https://www.elastic.co/guide/en/elasticsearch/reference/current/index-lifecycle-management.html
The other thought is that maybe a non-conforming document is being added. If I'm understanding the problem correctly, in one case the source subfields are properly being analyzed, and in the other case it's simply being treated as a single field. How is the data being ingested?
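The two cases can be pictured as two sources disagreeing on a field's type: the index mapping treats source as an object with subfields, while something else reports it as plain text. A simplified, hypothetical sketch of that kind of merge (not Kibana's actual code; all names illustrative):

```python
# Merge per-source "field -> type" maps; any field reported with two
# different types ends up marked "conflict".
def merge_field_types(sources):
    merged = {}
    for types in sources:
        for field, ftype in types.items():
            if field in merged and merged[field] != ftype:
                merged[field] = "conflict"
            else:
                merged[field] = ftype
    return merged

# The index mapping sees "source" as an object with typed subfields...
index_fields = {"source": "object", "source.ip": "ip", "source.port": "long"}
# ...while another source (e.g. an app pushing its own field list) says text.
app_fields = {"source": "text"}

merged = merge_field_types([index_fields, app_fields])
print(merged["source"])  # conflict
```

Once a field is in this state, which type "wins" depends on which source was loaded last, which would explain the back-and-forth behaviour described above.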
RhysEvans
(Rhys Evans)
November 11, 2019, 8:41pm
5
Hi
Yep, we are using an index template (well, a chain of templates, 3 or 4).
No, the field mapping doesn't seem to be changing in ES.
I am not using ILM.
Data is being ingested via Wazuh agents / syslog, through Logstash, processed, and then output into Elasticsearch.
So after all of that, I think I may have found the problem. The Wazuh Kibana app seems to be regularly loading a known_fields mapping into Kibana: https://raw.githubusercontent.com/wazuh/wazuh-kibana-app/master/server/integration-files/known-fields.js . There the source field has been defined as text, so I have removed the source "mapping" from the file and am busy testing now.
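The workaround amounts to dropping the top-level "source" entry from the known-fields list so it no longer clashes with the ECS object mapping. A sketch with illustrative data (not the real contents of known-fields.js):

```python
# Hypothetical known-fields style list; only the top-level "source" entry
# conflicts with the ECS object mapping and needs to go.
known_fields = [
    {"name": "source", "type": "string"},       # conflicts with ECS object
    {"name": "source.ip", "type": "ip"},
    {"name": "network.protocol", "type": "string"},
]

patched = [f for f in known_fields if f["name"] != "source"]
print([f["name"] for f in patched])  # ['source.ip', 'network.protocol']
```

The subfield entries (source.ip etc.) are left alone, since those agree with the index template.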
I have also raised the question of why this is in place, via https://groups.google.com/forum/#!topic/wazuh/tz6PCPPSdq8 , and am awaiting a response.
Thanks for your help so far.
RhysEvans
(Rhys Evans)
November 13, 2019, 12:50pm
6
Hi
Just to confirm, this was down to the third-party app.
Thanks
system
(system)
Closed
December 11, 2019, 12:50pm
7
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.