Issues getting parsed Logstash logs into Kibana - 6.8.2

Hey everyone,
We are having issues taking the parsed logs from Logstash and getting them into Kibana to produce tags and graphs. The error states "No cached mapping for this field. Refresh list from Management > Index Patterns page". When we click the refresh button, nothing happens. Could someone point us in the right direction?

Thanks a ton!

Hi
Could you provide the JSON view of this record? And could you export and provide the index pattern object? This can be done in Management > Saved Objects.
There you can filter by the name of the index pattern, then select and export it.
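
If the export from the UI gives you trouble, the saved objects HTTP API should also work (a sketch, assuming Kibana on localhost:5601 - the <id> placeholder is yours to fill in):

    # Find the index pattern saved object and note its id:
    curl -s 'http://localhost:5601/api/saved_objects/_find?type=index-pattern&search_fields=title&search=logstash*'

    # Fetch the full object by id:
    curl -s 'http://localhost:5601/api/saved_objects/index-pattern/<id>'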

thanks!
Best,
Matthias

Hi Matthias,
Thanks for your reply. Here is the JSON view, followed by the index pattern object. We are using *:logstash-* as our index pattern, but even when we add new fields, they do not show up (e.g. src_port; see the quick check after the record below).

The index pattern is too large to attach here. Let me know if there is a way to send it to you. Or, if you prefer, I can send some screenshots of it.

{
  "_index": "so-dev:logstash-firewall-2019.11.28",
  "_type": "doc",
  "_id": "9w8msm4B4A1jhnxIuXwy",
  "_version": 1,
  "_score": null,
  "_source": {
    "dst_port": "xxxxx",
    "dst_ip": "x.x.x.x",
    "event_type": "firewall",
    "action": "denied",
    "hit_count": "1",
    "message": "%ASA-6-106100: access-list 100 denied tcp outside/xxxx(xxx) -> inside/xxxxx(xxxxx) hit-cnt 1 300-second interval [0x40c998bc, 0x00000000]",
    "interval": "300-second interval",
    "tags": [
      "syslogng",
      "firewall",
      "conf_file_1005"
    ],
    "hashcode1": "0x40c998bc",
    "@version": "1",
    "syslog_facility": "user-level",
    "syslog-legacy_msghdr": ": ",
    "host": "gateway",
    "logstash_time": 0.0016069412231445312,
    "policy_id": "100",
    "port": 45936,
    "syslog-host_from": "xxxxx",
    "syslog-facility": "local2",
    "hashcode2": "0x00000000",
    "protocol": "tcp",
    "syslog_severity": "notice",
    "src_interface": "outside",
    "@timestamp": "2019-11-28T13:15:12.415Z",
    "syslog-host": "xxxxx",
    "syslog-priority": "info",
    "src_port": "443",
    "syslog-sourceip": "xxxxxx",
    "syslog-tags": ".source.s_network",
    "src_ip": "xxxxx",
    "syslog_severity_code": 5,
    "syslog_facility_code": 1,
    "dst_interface": "inside"
  },
  "fields": {
    "@timestamp": [
      "2019-11-28T13:15:12.415Z"
    ]
  },
  "highlight": {
    "event_type": [
      "@kibana-highlighted-field@firewall@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1574946912415
  ]
}
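
If it helps: as far as I know, Kibana builds the index pattern field list with the field_caps API, so we can check whether src_port is visible through the cross-cluster connection directly (a sketch - the host and pattern below are assumptions matching our setup):

    curl -s 'http://localhost:9200/*:logstash-*/_field_caps?fields=src_port'

If that returns the field but the refresh button still doesn't pick it up, the problem is presumably on the Kibana side rather than in the mapping.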

Thanks a ton!

Thanks. Quick question: you're using cross-cluster search, right?

Yes, we are.

I've heard of a similar issue before. I'll do some research - I think there might be an issue with the index pattern refresh.
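
In the meantime, one workaround that sometimes helps is deleting and recreating the index pattern, which forces a fresh fetch of the field list. You can do that in the UI, or roughly like this via the saved objects API (a sketch - the <id>, title, and time field are assumptions, and write calls need the kbn-xsrf header):

    # Delete the existing index pattern:
    curl -s -X DELETE 'http://localhost:5601/api/saved_objects/index-pattern/<id>' -H 'kbn-xsrf: true'

    # Recreate it with the same title and time field:
    curl -s -X POST 'http://localhost:5601/api/saved_objects/index-pattern' \
      -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
      -d '{"attributes":{"title":"*:logstash-*","timeFieldName":"@timestamp"}}'

Note that saved searches and visualizations reference the index pattern by ID, so they may need to be repointed after recreating it.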

Excellent - I will wait for your reply.
Thank you so much!!

Sorry - I should mention that we have this installed using Security Onion. That's probably pertinent information!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.