In Logstash, how do I limit the depth of JSON properties in my logs that are turned into Index fields in Elasticsearch?

I'm fairly new to the Elastic Stack. I'm using Logstash 6.4.0 to load JSON log data from Filebeat 6.4.0 into Elasticsearch 6.4.0. Once I start using Kibana 6.4.0, I'm finding that far too many JSON properties are being converted into index fields.

I know this because when I navigate to Discover in Kibana and select my logstash-* index pattern, I get an error message that states:

Discover: Trying to retrieve too many docvalue_fields. Must be less than or equal to: [100] but was [106]. This limit can be set by changing the [index.max_docvalue_fields_search] index level setting.
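
The error message itself points at a workaround: raising the per-index limit on docvalue fields. A minimal sketch of that, assuming the default logstash-* indices and an arbitrary new limit of 200 (this only raises the ceiling, it doesn't reduce the number of fields):

PUT logstash-*/_settings
{
    "index.max_docvalue_fields_search": 200
}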

If I navigate to Management > Kibana > Index Patterns, I see that I have 940 fields. It appears that every child property of my root JSON object (and many of those children have JSON objects as values, and so on, recursively) is automatically being parsed and used to create a field in my Elasticsearch logstash-* index.
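
Dumping the index mapping directly from Elasticsearch should confirm where these fields come from, since every nested JSON key gets its own property entry:

GET logstash-*/_mapping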

So here's my question: how can I limit this automatic field creation? Is it possible to limit it by property depth? Is it possible to do it some other way?
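
To make the intent concrete, here is the kind of thing I am imagining (an untested sketch; the field path is taken from the example log further down): a ruby filter that serializes a deeply nested subtree back into a single JSON string, so that the whole subtree becomes one indexed field instead of dozens:

filter {
    ruby {
        init => "require 'json'"
        code => "
            resp = event.get('[eventProperties][logAction][response]')
            event.set('[eventProperties][logAction][response]', resp.to_json) unless resp.nil?
        "
    }
}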

Here is my Filebeat configuration (minus the comments):

filebeat.inputs:
- type: log
  enabled: true
  paths:
  - d:/clients/company-here/rpms/logs/rpmsdev/*.json
  json.keys_under_root: true
  json.add_error_key: true

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.logstash:
  hosts: ["localhost:5044"]
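
As I understand the Filebeat docs, the json.* options are what make Filebeat decode each line, and json.keys_under_root is what promotes every decoded key to the top of the event. If I dropped the json.* options entirely (untested; just my reading of the docs), each line would ship to Logstash as a single raw string in the message field, and the parsing decision would move into my pipeline:

filebeat.inputs:
- type: log
  enabled: true
  paths:
  - d:/clients/company-here/rpms/logs/rpmsdev/*.json
  # no json.* options: each line arrives in Logstash as a raw "message" string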

Here is my current Logstash pipeline configuration:

input {
    beats {
        port => "5044"
    }
}
filter {
    date {
        match => [ "@timestamp" , "ISO8601"]
    }
}
output {
    stdout { 
        #codec => rubydebug 
    }
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}
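
Since Filebeat has already decoded the JSON by the time events reach this pipeline, I assume any fix would have to go into the filter block above. One alternative I have run across is the prune filter, which can whitelist top-level fields by regex, though unlike serializing a subtree to a string it drops the nested data outright (untested sketch; the whitelist entries are just the top-level keys I care about):

filter {
    prune {
        whitelist_names => [ "^@timestamp$", "^level$", "^logger$", "^message$" ]
    }
}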

Here is an example of a single log message that I am shipping (one row of my log file) – note that the JSON is completely dynamic and can change depending on what's being logged:

{
    "@timestamp": "2018-09-06T14:29:32.128",
    "level": "ERROR",
    "logger": "RPMS.WebAPI.Filters.LogExceptionAttribute",
    "message": "Log Exception: RPMS.WebAPI.Entities.LogAction",
    "eventProperties": {
        "logAction": {
            "logActionId": 26268916,
            "performedByUserId": "b36778be-6181-4b69-a0fe-e3a975ddcdd7",
            "performedByUserName": "test.sga.danny@domain.net",
            "performedByFullName": "Mike Manley",
            "controller": "RpmsToMainframeOperations",
            "action": "UpdateStoreItemPricing",
            "actionDescription": "Exception while updating store item pricing for store item with storeItemId: 146926.",
            "url": "http://localhost:49399/api/RpmsToMainframeOperations/UpdateStoreItemPricing/146926",
            "verb": "PUT",
            "statusCode": 500,
            "status": "Internal Server Error - Exception",
            "request": {
                "itemId": 648,
                "storeId": 13,
                "storeItemId": 146926,
                "changeType": "price",
                "book": "C",
                "srpCode": "",
                "multi": 0,
                "price": "1.27",
                "percent": 40,
                "keepPercent": false,
                "keepSrp": false
            },
            "response": {
                "exception": {
                    "ClassName": "System.Net.Http.HttpRequestException",
                    "Message": "An error occurred while sending the request.",
                    "Data": null,
                    "InnerException": {
                        "ClassName": "System.Net.WebException",
                        "Message": "Unable to connect to the remote server",
                        "Data": null,
                        "InnerException": {
                            "NativeErrorCode": 10060,
                            "ClassName": "System.Net.Sockets.SocketException",
                            "Message": "A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond",
                            "Data": null,
                            "InnerException": null,
                            "HelpURL": null,
                            "StackTraceString": "[Truncated for brevity]",
                            "RemoteStackTraceString": null,
                            "RemoteStackIndex": 0,
                            "HResult": -2147467259,
                            "Source": "System",
                            "WatsonBuckets": null
                        },
                        "HelpURL": null,
                        "StackTraceString": "[Truncated for brevity]",
                        "RemoteStackTraceString": null,
                        "RemoteStackIndex": 0,
                        "HResult": -2146233079,
                        "Source": "System",
                        "WatsonBuckets": null
                    },
                    "HelpURL": null,
                    "StackTraceString": "[Truncated for brevity]",
                    "RemoteStackTraceString": null,
                    "RemoteStackIndex": 0,
                    "HResult": -2146233088,
                    "Source": "mscorlib",
                    "WatsonBuckets": null,
                    "SafeSerializationManager": {
                        "m_serializedStates": [{

                        }]
                    },
                    "CLR_SafeSerializationManager_RealType": "System.Net.Http.HttpRequestException, System.Net.Http, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                }
            },
            "performedAt": "2018-09-06T14:29:32.1195316-05:00"
        }
    },
    "logAction": "RPMS.WebAPI.Entities.LogAction"
}
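
Looking at that payload, nearly all of the field explosion comes from the eventProperties subtree, especially the chain of InnerException objects. If it were acceptable to keep that data in _source only (visible in the document, but with no individual fields mapped), my understanding is that an index template could disable mapping for the subtree entirely. An untested sketch, where the template name is made up and doc is the document type that the Logstash 6.x Elasticsearch output uses by default:

PUT _template/logstash-eventproperties-noindex
{
    "index_patterns": ["logstash-*"],
    "mappings": {
        "doc": {
            "properties": {
                "eventProperties": {
                    "type": "object",
                    "enabled": false
                }
            }
        }
    }
}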

See my solution at: https://stackoverflow.com/questions/52212228/in-logstash-how-do-i-limit-the-depth-of-json-properties-in-my-logs-that-are-tur
