Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}


(Akash John) #1

Hi Team,

We are using filebeat-6.0.1-1 and trying to parse a custom log into ES (5.5.x):

{ "Level":"DEBUG", "Date": "2018-06-04 20:11:24.277", "Thread": "[https-jsse-nio-8443-exec-9]", "Context": "RestProcessor", "Log": {"user":"","action":"Invoke Vault API","message":"Calling the vault end point [https://127.0.0.1:8200/v1/auth/token/revoke-self] using POST method","apiurl":"https://10.65.57.100/app/auth/tapp/revoke"} }

But we are getting an error as given below,

2018-06-04T22:13:23Z ERR Error decoding JSON: unexpected EOF
2018-06-04T22:13:23Z ERR Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}
2018-06-04T22:13:23Z ERR Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}
2018-06-04T22:13:23Z ERR Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}
2018-06-04T22:13:23Z ERR Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}
2018-06-04T22:13:23Z ERR Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}
2018-06-04T22:13:23Z ERR Error decoding JSON: invalid character '}' looking for beginning of value

Could you please let us know how we can fix this issue?

The Filebeat configuration used is attached to this ticket.


(Jaime Soriano) #2

Are all lines in the log file valid JSON?


(Akash John) #3

Yes, they are valid. I tested them in a JSON viewer and a JSON editor.


(Jaime Soriano) #4

I cannot reproduce this issue. Could you explain more about these logs? Do these errors always appear? Do they appear only after a while?

One thing I have seen is that you are setting json.message_key to log, but there is no field named log. You only need to set this option if you want to apply some filtering or multiline rule to the message.
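For context, the JSON options involved look roughly like this in filebeat.yml. This is only a sketch with an assumed path, not the poster's attached configuration:

```yaml
filebeat.prospectors:
- type: log
  paths:
    - /root/sample.log   # assumed path, taken from the ES output later in the thread
  json.keys_under_root: true
  json.add_error_key: true
  # json.message_key is only needed when combining JSON decoding with
  # multiline or filtering rules, and it must name an existing top-level
  # field in the decoded object:
  # json.message_key: Log
```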


(Akash John) #5

Hi @jsoriano,

I cannot reproduce this issue. Could you explain more about these logs? Do these errors always appear? Do they appear only after a while?

  • This is a custom error log we created for one of our applications. Similarly structured logs keep being generated as activities are performed in the application. Whenever a log entry is generated, we get this error in the Filebeat logs. The entries remain in the log file for at least an hour.

One thing I have seen is that you are setting json.message_key to log, but there is no field named log. You only need to set this option if you want to apply some filtering or multiline rule to the message.

  • We do have the field log in the sample I shared:

({ "Level":"DEBUG", "Date": "2018-06-04 20:11:24.277", "Thread": "[https-jsse-nio-8443-exec-9]", "Context": "RestProcessor", "Log": {"user":"","action":"Invoke Vault API","message":"Calling the vault end point [https://127.0.0.1:8200/v1/auth/token/revoke-self] using POST method","apiurl":"https://10.65.57.100/app/auth/tapp/revoke"} })

If this configuration is not sufficient, could you please help me build a configuration that pushes this log to ES?


(Jaime Soriano) #6

And the errors in Filebeat: do they always appear, only sometimes, or only after a while? Are any of these messages shipped to Elasticsearch?

The json.message_key option is case sensitive, so in this case it'd need to be Log, with an uppercase L. In any case I think this is something you don't need, because the log message is already structured JSON, so you could remove this line from the configuration.
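In other words, the two possible fixes look like this. This is a hypothetical fragment, assuming the configuration from the attached screenshot:

```yaml
# If the option is kept, its value must match the field name exactly:
json.message_key: Log   # not "log"

# But since every record is already a complete JSON object, the simplest
# fix is to delete the json.message_key line altogether.
```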


(Akash John) #7

And the errors in Filebeat: do they always appear, only sometimes, or only after a while? Are any of these messages shipped to Elasticsearch?

I am trying to push only one sample log entry to ES and it fails; I believe the others fail the same way, since I tried them earlier.

The json.message_key option is case sensitive, so in this case it'd need to be Log, with an uppercase L. In any case I think this is something you don't need, because the log message is already structured JSON, so you could remove this line from the configuration.

I updated the configuration as you suggested, but no luck with that either.


The log entry I tried to push to ES using Filebeat:

{"Level":"DEBUG","Date":"2018-06-05 17:52:33.419","Thread":"[https-jsse-nio-8443-exec-4]","Context":"VaultAuthController","Log":{"user":"user1","action":"User Login","message":"Authentication Successful","httpstatus":"200","apiurl":"https://dev-app.dnsname.com/app/auth/tapp/login/vault/auth/tvault/login"}}

The data I can see in the Elasticsearch index is:

{"took":1,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":7,"max_score":1,"hits":[
{"_index":"aryan","_type":"doc","_id":"AWPRd8K41KLsrRyh9F3F","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","source":"/root/sample.log","offset":20,"error":{"message":"Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}","type":"json"},"Log":" \"Level\":\"DEBUG\",","prospector":{"type":"log"},"beat":{"name":"8c435b7cf226","hostname":"8c435b7cf226","version":"6.0.1"}}},
{"_index":"aryan","_type":"doc","_id":"AWPRd8K41KLsrRyh9F3G","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","source":"/root/sample.log","offset":56,"error":{"message":"Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}","type":"json"},"Log":" \"Date\": \"2018-06-05 17:52:33.419\",","prospector":{"type":"log"},"beat":{"name":"8c435b7cf226","hostname":"8c435b7cf226","version":"6.0.1"}}},
{"_index":"aryan","_type":"doc","_id":"AWPRd8K41KLsrRyh9F3I","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","Log":" \"Context\": \"appAuthController\",","prospector":{"type":"log"},"beat":{"name":"8c435b7cf226","hostname":"8c435b7cf226","version":"6.0.1"},"source":"/root/sample.log","offset":135,"error":{"type":"json","message":"Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}"}}},
{"_index":"aryan","_type":"doc","_id":"AWPRd8K41KLsrRyh9F3J","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","beat":{"version":"6.0.1","name":"8c435b7cf226","hostname":"8c435b7cf226"},"error":{"message":"Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}","type":"json"},"source":"/root/sample.log","offset":318,"Log":" \"Log\": {\"user\":\"user1\",\"action\":\"User Login\",\"message\":\"Authentication Successful\",\"httpstatus\":\"200\",\"apiurl\":\"https://dev-app.dnsname.com/app/auth/tapp/login\"}","prospector":{"type":"log"}}},
{"_index":"aryan","_type":"doc","_id":"AWPRd8K41KLsrRyh9F3E","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","error":{"message":"Error decoding JSON: unexpected EOF","type":"json"},"prospector":{"type":"log"},"beat":{"name":"8c435b7cf226","hostname":"8c435b7cf226","version":"6.0.1"},"Log":"{","source":"/root/sample.log","offset":2}},
{"_index":"aryan","_type":"doc","_id":"AWPRd8K51KLsrRyh9F3K","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","offset":320,"Log":"}","error":{"type":"json","message":"Error decoding JSON: invalid character '}' looking for beginning of value"},"source":"/root/sample.log","prospector":{"type":"log"},"beat":{"name":"8c435b7cf226","hostname":"8c435b7cf226","version":"6.0.1"}}},
{"_index":"aryan","_type":"doc","_id":"AWPRd8K41KLsrRyh9F3H","_score":1,"_source":{"@timestamp":"2018-06-05T19:41:15.705Z","source":"/root/sample.log","offset":100,"error":{"message":"Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}","type":"json"},"Log":" \"Thread\": \"[https-jsse-nio-8443-exec-4]\",","prospector":{"type":"log"},"beat":{"version":"6.0.1","name":"8c435b7cf226","hostname":"8c435b7cf226"}}}
]}}

Could you please let me know if we need to modify any other configurations?

Please note that there are some other Filebeat instances also running on the same server, which are pushing different logs to Elasticsearch without any issue.


(Akash John) #8

Hi @jsoriano,

The issue was the log structure. If a JSON object is split across multiple lines, we get the error mentioned above.

If we put each entire log object on a single line, Filebeat parses it and pushes it to ES. This is mentioned in the documentation (JSON decoding only works if there is one JSON object per line), but I missed it.
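As a sanity check, here is a small Python sketch (not from the thread) showing why line-by-line JSON decoding fails on a pretty-printed object, and how re-serializing the object onto one line fixes it:

```python
import json

# A pretty-printed log record, as it appeared in the original file.
multiline = """{
  "Level": "DEBUG",
  "Log": {"user": "user1", "action": "User Login"}
}"""

# Filebeat decodes one line at a time, so every individual line of a
# multiline object is invalid JSON on its own.
failed_lines = 0
for line in multiline.splitlines():
    try:
        json.loads(line)
    except json.JSONDecodeError:
        failed_lines += 1
print(failed_lines)  # all 4 lines fail individually

# Re-serialize the whole object without newlines (one object per line,
# NDJSON style), which is the form Filebeat's JSON decoding expects.
record = json.loads(multiline)
single_line = json.dumps(record, separators=(",", ":"))
assert "\n" not in single_line
```

The same compaction can be done once at the source by having the application's logger emit each record with a compact serializer instead of a pretty-printing one.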

In short, we can mark this ticket as closed.


(Jaime Soriano) #9

Oh, I see, nice to hear you found the problem :slight_smile:

Was there any reason to split a single JSON object across multiple lines?


(Akash John) #10

No, our dev team just wrote the code that way; there is no particular reason for it.


(system) #11

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.