Single field object to Elasticsearch using Logstash

Hi Support,

I have GROUP_CONCAT field values like this:

Document_Type
1,2,3,4,5,6

In Elasticsearch, the field is mapped as an object type:

"Document_Type": {
        "properties": {
          "d": {
            "type": "short"
          }
        }
 }

When I run Logstash, I get this error:

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [Document_Type] tried to parse field [Document_Type] as object, but found a concrete value"}

Can you please help me?

Thanks

Hi there,

It means that in your mapping you defined Document_Type as an object (and indeed what you posted is an object mapping), but you're sending it as something different: what you wrote under Document_Type is a comma-separated list of numbers, not an object.
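
To illustrate the mismatch, here is a minimal Ruby-style sketch (the values are just taken from your post):

# what the object mapping can parse: an object, or an array of objects
expected = { "Document_Type" => [{ "d" => 1 }, { "d" => 2 }] }

# what a GROUP_CONCAT column gives you: a single string, i.e. a concrete value
actual = { "Document_Type" => "1,2,3,4,5,6" }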

Can you please share a sample input document and the complete mapping for that index?

Hi Fabio,

Thanks for the reply.

Yes, you understood correctly.

Below are the mapping and an example.

DELETE testindex
PUT testindex
PUT testindex/_mapping
{
  "properties": {
    "Document_Type": {
      "type": "object",
      "properties": {
        "d": {
          "type": "short"
        }
      }
    },
    "filename": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 256
        }
      }
    },
    "fileid": {
      "type": "long"
    }
  }
}
POST testindex/_doc
{
  "Document_Type":[{"d":1},{"d":1}],
  "fileid" : 1,
  "filename" :"elasticsearch filename.pdf"
}
GET testindex/_search

The example above works fine, but not when indexing with Logstash. Here is my pipeline:

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test?useCursorFetch=true"
    # the path to our downloaded JDBC driver
    jdbc_driver_library => "mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT a.fileid, a.filename, group_concat(b.document_type) as Document_Type FROM file_table AS a LEFT JOIN file_doctype AS b ON a.fileid = b.fileid group by a.fileid order by a.fileid"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "testindex"
  }
}

I am getting this error:

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [Document_Type] tried to parse field [Document_Type] as object, but found a concrete value"}
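
For what it's worth, GROUP_CONCAT returns the document types as one comma-separated string, so I assume the event that reaches Elasticsearch looks roughly like this (values taken from my example above):

{
    "fileid" => 1,
    "filename" => "elasticsearch filename.pdf",
    "Document_Type" => "1,2,3,4,5,6"
}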

Can you help me with this?

Thanks

Any Hope?

Hi there,

I'm sorry, I've been kinda busy and didn't see the notification. Anyway, can you share the output of Logstash? I mean, you're sending to both Elasticsearch and standard output.

What is written in the stdout?

Hi Fabio,

I have resolved it.

Thanks

Very good!

Would you mind posting your solution here and marking it as the solution? It might be useful for future users.

Sure Fabio.

Below is the solution: a ruby filter that splits the GROUP_CONCAT string into an array of objects that matches the mapping.

ruby {
  code => "
    # split the GROUP_CONCAT string (e.g. '1,2,3') and rebuild Document_Type
    # as an array of objects matching the mapping: [{'d' => 1}, {'d' => 2}, ...]
    # (the || '' guards against rows with no document types from the LEFT JOIN)
    r = []
    data = (event.get('Document_Type') || '').split(',')
    data.each { |value|
      r << { 'd' => value.to_i }
    }
    event.set('Document_Type', r)
  "
}
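
For reference, this block goes in the filter section of the pipeline, between the jdbc input and the elasticsearch output. An equivalent, more compact sketch of the same idea:

filter {
  ruby {
    code => "
      event.set('Document_Type',
        (event.get('Document_Type') || '').split(',').map { |v| { 'd' => v.to_i } })
    "
  }
}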
