I am using a multiline config in Filebeat to ship each multi-line log entry as a single event, and I am not sure whether that is the cause of this.
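For reference, the relevant section of my filebeat.yml looks roughly like this (the path and pattern here are illustrative, not my exact values, and the top-level key varies slightly by Filebeat version):

filebeat.inputs:              # filebeat.prospectors on older 5.x/6.x versions
- type: log
  paths:
    - /var/solr/logs/solr.log
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'   # a line starting with a date begins a new event
  multiline.negate: true                    # lines NOT matching the pattern...
  multiline.match: after                    # ...are appended to the preceding line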
I am currently getting the following error when indexing certain log files:
response=>{"index"=>{"_index"=>"filebeat-2018.08.08", "_type"=>"log", "_id"=>"AWUbQVOf54TmqKKkRQHo", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [source] tried to parse field [source] as object, but found a concrete value"}}}
Here is the full output from the Logstash log:
[2018-08-08T20:36:30,765][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.08.08", :_type=>"log", :_routing=>nil}, 2018-08-08T20:36:23.367Z vdpsidxtst05 2018-08-08 20:36:22.512 ERROR (qtp401424608-643) [ ] o.a.s.s.HttpSolrCall null:org.apache.solr.common.SolrException: Core with name 'cpcr' already exists.
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:839)
at org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$0(CoreAdminOperation.java:88)
at org.apache.solr.handler.admin.CoreAdminOperation.execute(CoreAdminOperation.java:370)
at org.apache.solr.handler.admin.CoreAdminHandler$CallInfo.call(CoreAdminHandler.java:388)
at org.apache.solr.handler.admin.CoreAdminHandler.handleRequestBody(CoreAdminHandler.java:174)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:173)
at org.apache.solr.servlet.HttpSolrCall.handleAdmin(HttpSolrCall.java:748)
at org.apache.solr.servlet.HttpSolrCall.handleAdminRequest(HttpSolrCall.java:729)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:510)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:347)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:298)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:534)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
at java.lang.Thread.run(Thread.java:745)
], :response=>{"index"=>{"_index"=>"filebeat-2018.08.08", "_type"=>"log", "_id"=>"AWUbQVOf54TmqKKkRQHo", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [source] tried to parse field [source] as object, but found a concrete value"}}}}
Here is the field mapping for the [source] field:

"source": {
  "properties": {
    "class": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 256
        }
      }
    }
  }
}
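That excerpt comes from the index mapping, which I pulled with something like the following (the host is illustrative):

curl -s 'http://localhost:9200/filebeat-2018.08.08/_mapping?pretty'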
Logstash creates a new index every day for that day's data. I ran into this yesterday, deleted the index, and it then processed the record, but that was only a temporary fix.
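The elasticsearch output in my Logstash config is the usual daily-index pattern, roughly like this (the host is illustrative; the index pattern matches the filebeat-2018.08.08 name above):

output {
  elasticsearch {
    hosts => ["localhost:9200"]          # illustrative host
    index => "filebeat-%{+YYYY.MM.dd}"   # one index per day, e.g. filebeat-2018.08.08
  }
}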
I would much appreciate pointers on how to manage this. Thanks!