Logstash Parser error - tried to parse field as object, but found a concrete value

Hi,

I am trying to parse a log, but I get this error.

[source] tried to parse field [source] as object, but found a concrete value

if "string" in [tags] {
      grok {            
          match => [ "message", "(?<ts>(.*?))\t(?<fuid>(.*?))\t(?<tx_hosts>(.*?))\t(?<rx_hosts>(.*?))\t(?<conn_uids>(.*?))\t(?<source>(.*?))\t(?<depth>(.*?))\t(?<analyzers>(.*?))\t(?<mime_type>(.*?))\t(?<filename>(.*?))\t(?<duration>(.*?))\t(?<local_orig>(.*?))\t(?<is_orig>(.*?))\t(?<seen_bytes>(.*?))\t(?<total_bytes>(.*?))\t(?<missing_bytes>(.*?))\t(?<overflow_bytes>(.*?))\t(?<timedout>(.*?))\t(?<parent_fuid>(.*?))\t(?<md5>(.*?))\t(?<sha1>(.*?))\t(?<sha256>(.*?))\t(?<extracted>(.*))" ]
       } 
        mutate { 
          add_tag => ["hello world from source"] 
          convert => [ "source", "string" ]
      } 
    }

The actual log:

1295981542.761080	FLNLOJ2zgI814vI3Lh	72.14.213.102	192.168.3.131	COcbTZ3MjJb30W6Wba	HTTP	0	(empty)	text/json	-	0.000000	-	F	273	-	0	0	F	-	-	-	-	-	-	-

Thank you in advance!

There are a boatload of threads in this forum that discuss this. Here is one.

Isn't it normal to find a concrete value? We still have to parse the value whether it exists or not.

Try this test; hopefully it makes things clearer.

POST test/_doc
{
  "myfield" : "myvalue",
  "myotherfield" :
  {
    "mysubfield1" : "mysubvalue1",
    "mysubfield2" : "mysubvalue2"
  }
}

Then post this:

POST test/_doc
{
  "myfield" : "myvalue",
  "myotherfield" : "myconcretevalue"
}

and you will get this error.

{
  "error" : {
    "root_cause" : [
      {
        "type" : "mapper_parsing_exception",
        "reason" : "object mapping for [myotherfield] tried to parse field [myotherfield] as object, but found a concrete value"
      }
    ],
    "type" : "mapper_parsing_exception",
    "reason" : "object mapping for [myotherfield] tried to parse field [myotherfield] as object, but found a concrete value"
  },
  "status" : 400
}

This is because the first document created a mapping where myotherfield is an object ...

You can see the mapping by running

GET /test/
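
With default dynamic mapping, the mappings part of that response will look roughly like this (text fields with a keyword sub-field; exact details may vary slightly by version):

{
  "test" : {
    "mappings" : {
      "properties" : {
        "myfield" : {
          "type" : "text",
          "fields" : {
            "keyword" : { "type" : "keyword", "ignore_above" : 256 }
          }
        },
        "myotherfield" : {
          "properties" : {
            "mysubfield1" : {
              "type" : "text",
              "fields" : {
                "keyword" : { "type" : "keyword", "ignore_above" : 256 }
              }
            },
            "mysubfield2" : {
              "type" : "text",
              "fields" : {
                "keyword" : { "type" : "keyword", "ignore_above" : 256 }
              }
            }
          }
        }
      }
    }
  }
}

Note that myotherfield has properties, i.e. it was mapped as an object.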

Then, when you try to post a document that has myotherfield as a simple concrete field/data type, it throws an error; that field cannot be both types.

The mapping (schema) is static for each field

Either a field is an object, or a simple data type, or an array, etc. ... once the type is defined, that type is "static", not "dynamic". All documents to be indexed need to adhere to the mapping. You either need to put the value in a different field or not index that document.

You can clean that up and do it in the opposite order... then the type will be a simple field (keyword and text). Then, if you try to add the doc with the sub-object, it will complain with a different error.
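
Concretely, something like this (wipe the test index first so the old mapping is gone):

DELETE test

POST test/_doc
{
  "myfield" : "myvalue",
  "myotherfield" : "myconcretevalue"
}

POST test/_doc
{
  "myfield" : "myvalue",
  "myotherfield" :
  {
    "mysubfield1" : "mysubvalue1",
    "mysubfield2" : "mysubvalue2"
  }
}

This time it is the second request, the one with the sub-object, that gets rejected.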

Once the mapping is defined for a field... the data type is static.

So in your case you have logs coming in where some fields get defined as an object, and other logs where that same field is a simple concrete value... you need to figure out which is which and solve for it.
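
For example (purely hypothetical document shapes, just to show the collision), one type of log might end up indexing

{ "source" : "HTTP" }

while another type indexes

{ "source" : { "ip" : "72.14.213.102", "port" : 80 } }

and those two cannot both live under the same field name in one index.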

Thank you! As far as I can see the field is a string, and I don't know why it was parsed as an object. Thank you for your answer!

Do you know how this error can be solved?

One solution is to partially parse the message and use some other identifier to conditionally split the parsing, one branch for the object and one for the concrete value, and put them in different fields.

Can you please give me an example, a programming one?

Or how can I change the mapping type of a field? If it is object, can I make it text? How can that cast be done?

You can't just "cast" an object into a different mapping type...

Perhaps @Badger can show you how to toString() something.
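
In a Logstash ruby filter, that kind of conversion can look roughly like this (a sketch only; the field name is just an example and there may be better ways):

ruby {
  code => '
    # hypothetical field name; only rewrite it when it is not already a plain string
    src = event.get("source")
    event.set("source", src.to_s) unless src.nil? || src.is_a?(String)
  '
}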

This is not an easy problem to solve...

The question is how many logs this affects: 50%... 1%... 0.00002%?

That would be what I focus on: solve for the majority first... then work on the leftovers.

Is there an identifier that you could sort on?
Can you post one log that has a concrete value and one log that has an object?

Perhaps we can take a look but I can't really write your code for you.
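
That said, the general shape of the conditional split I mentioned looks something like this; every tag, condition, and field name below is invented for illustration and would need to be adapted to your pipeline:

filter {
  if "files_log" in [tags] {
    # hypothetical log type that carries a plain string in the sixth column,
    # so capture it as [file_source] instead of [source]
    grok {
      match => { "message" => "^(?<ts>[^\t]*)\t(?<fuid>[^\t]*)\t(?<tx_hosts>[^\t]*)\t(?<rx_hosts>[^\t]*)\t(?<conn_uids>[^\t]*)\t(?<file_source>[^\t]*)\t" }
    }
  } else if "conn_log" in [tags] {
    # hypothetical log type that carries object-style data, so it can keep
    # writing into [source] as a nested object
    grok {
      match => { "message" => "%{IP:[source][ip]}:%{NUMBER:[source][port]}" }
    }
  }
}

Whether the condition is a tag, a field extracted by a first-pass grok, or something else depends on what your logs actually contain.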

This happens to 3 different types of logs. Please note that all the tools are on version 7.12.

This isn't an issue with the Stack version; the issue is that you have 3 types of logs with different structures that you are trying to force into a single schema. You can do it, but it will take some work.

Can you provide a sample of the 3 types of logs? If not, we certainly cannot help.

I asked you other questions that you did not answer... I asked them for specific reasons.

Do you know the ratio of the logs? Start with the most common one first. Get it working, then move on to the next.

That is my suggestion. If you want help, you need to provide the answers and samples I/we requested.

Thank you! I've split the fields and renamed the one with the incorrect type, and it worked like a charm. Thank you for your support!
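
(A rename like that is typically done with a mutate filter, along these lines; the field names here are placeholders, since the actual config was not shared:)

mutate {
  # placeholder field names, not the poster's actual config
  rename => { "source" => "file_source" }
}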

Nice ...

@Automation_Scripts We would ask you to share your solution to help others ... please :slight_smile:
