Not able to view JSON data in Available fields for message JSON

In my Kibana dashboard I have the JSON data below:

"message": "{ "time": "2021-06-15T08:57:20+00:00", "remote_addr": "", "the_real_ip": "10..2.3.4", "remote_user": "", "time_local": "15/Jun/2021:08:57:20 +0000", "body_bytes_sent": "1811", "request_time": "0.117", "upstream_response_time": "0.120", "status": "200", "request": "POST /api/HTTP/1.1", "request_method": "POST", "http_referer": "", "hostname": "api-controller", "http_user_agent": "", "proxy_protocol_addr": "", "x-forward-for": "", "request_id": "abc04c4a2d","bytes_sent": 2732,"vhost": "api", "request_proto": "HTTP/1.1" }",

I want the above message data to appear in the Available fields column on the left side of the Kibana Discover page. All fields like time, remote_addr, remote_user should show up in the Available fields column/tab. Any help on how this can be done?

I have the below configured in my Filebeat ConfigMap:

filebeat.inputs:
- type: log
  paths:
    - /var/lib/docker/containers/*/*.log
  json.message_key: message
  json.keys_under_root: true

processors:
  - add_kubernetes_metadata:
      in_cluster: true
  - decode_json_fields:
      fields: ["message"]
      target: ""

But I am still not able to get all the key/value pairs of the message JSON in Discover.


Could you share the mapping of your data? Which version of Kibana are you using? Which fields are displayed in the sidebar?

Thx & Best,

Hi Matthias,

Kibana version is 7.3.2. The other fields that are visible include agent_id, time, etc.
None of the fields of the message JSON are visible in the sidebar.
The above data is then passed to ES. Please let me know if you need any more inputs.
Any help would be appreciated.

Can you show a sample document? Does the message field by any chance contain a string that contains a serialised JSON document?

Hi Christian,

I have updated my question; does that help? I have other fields as well apart from message, like agent_id, time, and stream, and they are all visible in the Discover tab. Only the message JSON key and its values are not visible in the Discover tab.

As you can see, the message field is indeed a string, which means the fields within that string have not been indexed and therefore do not show up in the sidebar.

Okay, so what should be the next step? Can you please guide me here?

You will need to parse that information out before indexing it, e.g. through an ingest pipeline or Logstash. You will then need to run an update by query with an ingest pipeline to update data already indexed.
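For reference, a minimal sketch of what such an ingest pipeline could look like, using the json processor to expand the serialised message string into top-level fields (the pipeline and index names here are placeholders, not from the thread):

```json
PUT _ingest/pipeline/parse-message-json
{
  "description": "Parse the serialised JSON in the message field",
  "processors": [
    {
      "json": {
        "field": "message",
        "add_to_root": true
      }
    }
  ]
}
```

Already-indexed documents could then be reprocessed with something like `POST my-index/_update_by_query?pipeline=parse-message-json`.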

I am new to this but will try. If you have any references, please do share.
Thanks for the help.

Where is your data coming from? Filebeat?

Yes, from Filebeat, and then we pass it to ES. In my question I have shared the snippet I was trying in the Filebeat config file, but I didn't get any output.

You could use the grok processor in an ingest pipeline for this case, have a look:

Here's more about ingest pipelines:

But of course, as @Christian_Dahlqvist mentioned, Logstash (Grok filter plugin | Logstash Reference [7.13] | Elastic) would also be an option, though this would mean installing and running another service, while ingest pipelines are part of Elasticsearch.
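If you want to try a pipeline on a sample document before applying it, Elasticsearch's _simulate endpoint can help. A sketch (the pipeline body and the sample document here are assumptions, not from the thread):

```json
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "json": { "field": "message", "add_to_root": true } }
    ]
  },
  "docs": [
    { "_source": { "message": "{ \"status\": \"200\", \"request_method\": \"POST\" }" } }
  ]
}
```

The response shows the document as it would be indexed, with status and request_method as top-level fields.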

Hi All,

I have created a local setup of ELK version 6.3.2. Can anyone please help me get the fields present in the log JSON to show on the left side, i.e. under Available fields?
I want the fields present inside the log JSON to be under the Available fields section.
I have tried many things but have no solution yet. I will attach the required Logstash conf file and Filebeat prospector below, as well as the log file which I am parsing.
Can someone please check and let me know if anything needs to be updated?

Logstash.conf file
input {
  beats {
    port => "5044"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
  }
}

File prospectors file
#This prospector captures the docker logs
- type: log
  paths:
    - /var/lib/docker/containers/test.log

test.log contents
{"log":"{ "time": "2021-06-17T08:40:04+00:00", "remote_addr": "", "the_real_ip": "", "remote_user": "", "time_local": "17/Jun/2021:08:40:04 +0000", "body_bytes_sent": "2809", "request_time": "0.007", "upstream_response_time": "0.008", "status": "200", "request": "GET /api/v1/p/prometheus HTTP/1.1", "request_method": "GET", "http_referer": "", "hostname": "api-controller-b7469b79-x9rzm", "http_user_agent": "Prometheus/2.13.0", "proxy_protocol_addr": "", "x-forward-for": ",", "request_id": "ttt","bytes_sent": 3927,"vhost": "", "request_proto": "HTTP/1.1", "path": "/api/v1/party/actuator/prometheus", "request_query": "", "request_length": 597, "duration": 0.007,"proxy_upstream_name": "pay-9000", "upstream_addr": "", "upstream_response_length": "13203", "upstream_response_time": "0.008", "upstream_status": "200", "http_x_forwarded_for": "" }\n","stream":"stdout","time":"2021-06-17T08:40:04.040356146Z"}
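Note that in this sample the log field itself contains a serialised JSON string, so a single json filter on message only exposes log, stream, and time. A second pass on log may be needed; a sketch of the filter block, assuming the Logstash json filter (field names taken from the sample above):

```
filter {
  json {
    source => "message"   # expands log, stream, time
  }
  json {
    source => "log"       # expands the inner nginx-style JSON into top-level fields
  }
}
```

After this, fields like status and request_method should appear under Available fields in Discover once the index pattern is refreshed.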

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.