You need to format your post using markdown. If you Google "markdown tutorial" you will find multiple sites that provide one. Use the preview pane on the right of the edit pane to make sure the code is formatted correctly.
What error do you get? What does the [message] field of your event look like? (Expand an event in the Discover pane of Kibana and copy and paste from the JSON tab.)
As I said, if the [message] field contains the text you said it contains then the filter configuration I posted will work. If it contains something else then it may not. Given that what you posted is not valid XML I suspect the [message] field may be slightly different.
Hello @Badger, there is XML in the message field in Kibana. Is there any way to confirm this?
I am new to Logstash and Kibana. Can you please let me know if there is any way to check what is in the message field?
Earlier today (my today, possibly your yesterday) you started another thread, which I spent some time working on, and then you deleted it before I could post my answer.
An XML filter expects a single XML element to surround everything in the source field. If there are two or more top-level XML elements then it will complain about trying to add a second item at the root.
You may be able to fix the message using something like
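a mutate filter along these lines (a sketch only; it assumes the problem is multiple top-level elements in [message], and the element and field names are illustrative) that wraps everything in a single synthetic root element before the xml filter runs:

```
filter {
  # Wrap the contents of [message] in a synthetic <root> element so the
  # xml filter sees exactly one top-level element.
  mutate {
    replace => { "message" => "<root>%{message}</root>" }
  }
  xml {
    source => "message"
    target => "parsedXml"
  }
}
```

Note that any xpath expressions downstream would then need to include the synthetic root element in their paths.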
And I realize that there was a ton of information in that post that you probably did not want to share. But it is hard for us to help you without a reproducible failure. If you do provide one then I, and several other folks, will be happy to test it and help.
Spending time narrowing down a reproducible example builds a skill that will get you more answers from more people.
As an example of reproduction... if you have a grok pattern that includes IPV4, do not obfuscate your IP address as "a.b.c.d" (which is not a valid IP address), just replace it with "1.2.3.4" (which is). It is a trivial change for you, it makes testing things easier for every person who reviews questions here, and it makes it much more likely that one of us will take the time to provide guidance. For example, a pattern like the one below (the field names are hypothetical) still matches with 1.2.3.4 but fails outright with a.b.c.d.
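```
filter {
  grok {
    # %{IPV4} only matches syntactically valid addresses, so "1.2.3.4"
    # works as an obfuscated sample value while "a.b.c.d" does not.
    match => { "message" => "client=%{IPV4:client_ip}" }
  }
}
```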
I tried the above, but it did not create a new field named result in the document in Kibana when the XML was parsed by Logstash and sent to Elasticsearch.
Do I need to add something else to take the value fetched by xpath and create a new field from it?
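One thing to check (a sketch only; the xpath expression below assumes a Jenkins-style build.xml and is purely illustrative): the xml filter's xpath option stores its results as arrays, so a mutate is usually needed to turn the value into a plain string field.

```
filter {
  xml {
    source => "message"
    store_xml => false
    # Hypothetical path; adjust it to the actual structure of your XML.
    xpath => { "/build/result/text()" => "result" }
  }
  # xpath destinations are arrays, so take the first element to get a
  # plain string field called [result].
  mutate {
    replace => { "result" => "%{[result][0]}" }
  }
}
```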
It seems like the source XML file (build.xml) that Filebeat sends to Logstash in the message field is getting corrupted while being transferred. I tested the XML both at the source location and in the message field: the file at the source location is well-formed, but when it arrives in the message field its structure has changed.
Do you know any way to resolve this ?
I also want <startTime> to be extracted from the XML in the "message" field, which has an epoch time as its value, and a new field created for it as well after converting the epoch value to UNIX time.
I tried to extract both fields, but it only created the <startTime> field and not the result field.
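For reference, a configuration along these lines (a sketch only; the xpath expressions assume a Jenkins-style build.xml with <result> and <startTime> under a <build> root, and that startTime is epoch milliseconds) would create both fields and convert the epoch value with a date filter:

```
filter {
  xml {
    source => "message"
    store_xml => false
    # Hypothetical paths; adjust them to the real structure of your XML.
    xpath => {
      "/build/result/text()"    => "result"
      "/build/startTime/text()" => "startTime"
    }
  }
  # xpath results are arrays, so flatten both into plain string fields.
  mutate {
    replace => {
      "result"    => "%{[result][0]}"
      "startTime" => "%{[startTime][0]}"
    }
  }
  # UNIX_MS parses epoch milliseconds; use UNIX instead if the value is
  # in seconds. The parsed date is written to [startTimestamp].
  date {
    match  => [ "startTime", "UNIX_MS" ]
    target => "startTimestamp"
  }
}
```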