Viewing data not collected through Logstash but shipped independently into Elasticsearch

Hi,

I am writing log-like data directly into Elasticsearch, bypassing the conventional logging infrastructure, Logstash included. The data is written as purely semantic JSON rather than as human-readable text messages that need parsing, mainly for the sake of visualization and analytics.
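
For concreteness, each entry is shipped with a plain index request along these lines (the index, type, and field names are illustrative only, and the exact curl syntax depends on the Elasticsearch version):

```
curl -XPUT 'localhost:9200/my-app-events/event/1' -H 'Content-Type: application/json' -d '
{
  "@timestamp":  "2017-03-02T14:07:12Z",
  "event":       "cache rebuild finished",
  "durationMs":  5123,
  "buildNumber": "1.4.2-567"
}'
```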

How should I structure my JSON documents so that they are just as available to Kibana and to the Kibana end user as conventional log data ingested through Logstash?

Are there official configuration knobs that determine which data Kibana makes available from Elasticsearch?
I could not locate any in https://www.elastic.co/guide/en/kibana/current/kibana-server-properties.html

Many Thanks!

You can view any sort of data in Kibana, irrespective of whether it is time based - logs, metrics, etc. Just add a new index pattern and you should be good to go.
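
For example, assuming your documents carry a date field, a mapping along these lines (index, type, and field names are illustrative; the exact syntax depends on your Elasticsearch version) ensures Kibana can offer that field as the time field when you create the index pattern:

```
curl -XPUT 'localhost:9200/my-app-events' -H 'Content-Type: application/json' -d '
{
  "mappings": {
    "event": {
      "properties": {
        "@timestamp": { "type": "date" }
      }
    }
  }
}'
```

Then in Kibana, add the index pattern (e.g. my-app-events*) and select that field as the time field.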

Are you having problems?

Coming back to this few-weeks-old thread, I still have some questions standing in my way. Namely: querying my index in the HQ plugin, I see that all of the following columns displayed for it are empty:

FieldFormatMap
TimeFieldName
IntervalName
Fields
Title
BuildNum

Whereas the column "Item" holds the actual items. My items are JSON objects containing a time field, a field I could call a title, and a field holding the build number of the application that pushed the item into Elasticsearch.
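
Concretely, an item looks roughly like this (field names are illustrative rather than my real ones):

```
{
  "timestamp":   "2017-03-02T14:07:12Z",
  "title":       "cache rebuild finished",
  "buildNumber": "1.4.2-567"
}
```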

What specific benefits am I missing by not having these fields within my items "mapped" to the aforementioned columns?

Many thanks,
Matan