Logstash unable to map field to Elasticsearch

I am using a multiline config in Filebeat to combine multi-line log entries into single events; I'm not sure if that is the cause of this.
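
For context, the multiline settings are along these lines (the path and pattern here are illustrative, not my exact config):

filebeat.prospectors:
- input_type: log
  paths:
    - /local/mnt/logs/filebeat/apps/solr/*.log   # illustrative path
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'        # a new event starts at a leading timestamp
  multiline.negate: true
  multiline.match: after                         # continuation lines attach to the previous event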

I am currently getting this error when indexing certain log files.

response=>{"index"=>{"_index"=>"filebeat-2018.08.08", "_type"=>"log", "_id"=>"AWUbQVOf54TmqKKkRQHo", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [source] tried to parse field [source] as object, but found a concrete value"}}}}

Here is the full output.

[2018-08-08T20:36:30,765][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.08.08", :_type=>"log", :_routing=>nil}, 2018-08-08T20:36:23.367Z vdpsidxtst05 2018-08-08 20:36:22.512 ERROR (qtp401424608-643) [   ] o.a.s.s.HttpSolrCall null:org.apache.solr.common.SolrException: Core with name 'cpcr' already exists.
	at org.apache.solr.core.CoreContainer.create(CoreContainer.java:839)
	at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:88)
	at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:370)
	at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:388)
	at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:174)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:173)
	at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:748)
	at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:729)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:510)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:347)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:298)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
	at org.eclipse.jetty.server.Server.handle(Server.java:534)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
	at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
	at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
	at java.lang.Thread.run(Thread.java:745)
], :response=>{"index"=>{"_index"=>"filebeat-2018.08.08", "_type"=>"log", "_id"=>"AWUbQVOf54TmqKKkRQHo", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [source] tried to parse field [source] as object, but found a concrete value"}}}}

Here is the field mapping for source:

"source": {
            "properties": {
              "class": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
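
For anyone following along, that snippet can be pulled from the day's index with the standard mapping API, e.g.:

curl --user elastic:changeme -X GET "localhost:9200/filebeat-2018.08.08/_mapping?pretty"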

Logstash creates a new index every day for that day's data. I ran into this yesterday and deleted the index, after which the record was processed, but that was only a temporary fix.
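
The temporary fix was just dropping the day's index, something like the following (note that this throws away all of that day's data):

curl --user elastic:changeme -X DELETE "localhost:9200/filebeat-2018.08.08"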

I would much appreciate pointers on how to manage this. Thanks!

For some reason the source field is mapped as an object (with a class subfield), so you can't index documents where source contains a concrete value. Which documents contain source.class fields? Are those incorrect? Or are the documents where source is a concrete value incorrect? Both can't be right.
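
You can reproduce the conflict in isolation. Once the first document maps source as an object, the second is rejected with the same mapper_parsing_exception (index and type names here are made up):

curl -X PUT "localhost:9200/maptest/log/1" -H 'Content-Type: application/json' -d '{"source": {"class": "com.example.Foo"}}'

curl -X PUT "localhost:9200/maptest/log/2" -H 'Content-Type: application/json' -d '{"source": "/var/log/app.log"}'

The second request fails with a 400 because the first one fixed the mapping of source as an object.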

Thank you for taking the time to reply, magnusbaeck! I am very new to all things ELK, so I am attempting to decipher your questions. I will update the thread once I have something of value.

Is this helpful?

sbdps-devops07{93}$ curl --user elastic:changeme -X GET "localhost:9200/_search?pretty=yes" -H 'Content-Type: application/json' -d' { "query": { "exists" : { "field" : "source" }}}'
{
  "took" : 920,
  "timed_out" : false,
  "_shards" : {
    "total" : 110,
    "successful" : 110,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 22129,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "filebeat-2018.08.02",
        "_type" : "log",
        "_id" : "AWT38XgvEhcjGUNcD1xa",
        "_score" : 1.0,
        "_source" : {
          "environment" : "tst",
          "logtype" : "application_log",
          "@timestamp" : "2018-08-02T00:00:50.887Z",
          "offset" : 1505290,
          "@version" : "1",
          "input_type" : "log",
          "beat" : {
            "name" : "vdpsidxtst05",
            "hostname" : "vdpsidxtst05",
            "version" : "5.5.0"
          },
          "host" : "vdpsidxtst05",
          "source" : "/local/mnt/logs/filebeat/apps/solr/solr-qdisk-new.log",
          "message" : "\tcommit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.store.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4s77,generation=223171}",
          "type" : "log",
          "tags" : [
            "beats_input_codec_plain_applied",
            "_grokparsefailure"
          ]
        }
      },
      {
        "_index" : "filebeat-2018.08.02",
        "_type" : "log",
        "_id" : "AWT38XkXEhcjGUNcD1xg",
        "_score" : 1.0,
        "_source" : {
          "logtype" : "application_log",
          "environment" : "tst",
          "@timestamp" : "2018-08-02T00:00:50.887Z",
          "offset" : 1520205,
          "@version" : "1",
          "input_type" : "log",
          "beat" : {
            "name" : "vdpsidxtst05",
            "hostname" : "vdpsidxtst05",
            "version" : "5.5.0"
          },
          "host" : "vdpsidxtst05",
          "source" : "/local/mnt/logs/filebeat/apps/solr/solr-qdisk-new.log",
          "message" : "\tcommit{dir=NRTCachingDirectory(MMapDirectory@/prj/solr_master01/vdpsidxtst05/solr-qdisk-new/data/qdisk/data/index lockFactory=org.apache.lucene.store.NativeFSLockFactory@7d74fad9; maxCacheMB=48.0 maxMergeSizeMB=4.0),segFN=segments_4s7a,generation=223174}",
          "type" : "log",
          "tags" : [
            "beats_input_codec_plain_applied",
            "_grokparsefailure"
          ]
        }
      },

This may be helpful as well:

{
  "_index": "filebeat-2018.08.09",
  "_type": "log",
  "_id": "AWUc16lD54TmqKKkR8pR",
  "_version": 1,
  "_score": null,
  "_source": {
    "offset": 1809078,
    "level": "ERROR",
    "profile": "tst",
    "logger": "com.blah.ems.search.solr.agile.AgileDocumentIndexer",
    "module": "api",
    "input_type": "log",
    "source": {
      "file": "AgileDocumentIndexer.java",
      "method": "index",
      "class": "com.blah.ems.search.solr.agile.AgileDocumentIndexer",
      "line": 68
    },

OK, so in one you have

"source" : "/local/mnt/logs/filebeat/apps/solr/solr-qdisk-new.log"

and in the other you have

"source": {
  "file": "AgileDocumentIndexer.java",
  "method": "index",
  "class": "com.blah.ems.search.solr.agile.AgileDocumentIndexer",
  "line": 68
}

You cannot have the source field be an object in some documents and a plain string in other documents in the same index. You would have to use multiple indexes.
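
A minimal sketch of that kind of routing in the Logstash pipeline, assuming the structured events can be recognized by the presence of a [source][class] field (the index prefixes are illustrative):

filter {
  # structured application logs (source is an object) go to their own index
  if [source][class] {
    mutate { add_field => { "[@metadata][index_prefix]" => "filebeat-app" } }
  } else {
    mutate { add_field => { "[@metadata][index_prefix]" => "filebeat" } }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][index_prefix]}-%{+YYYY.MM.dd}"
  }
}

Fields under [@metadata] are not written to the indexed document, so the routing prefix stays out of Elasticsearch.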