Elasticsearch issue: trimming the data from the Message field

Hi All,

I am new to the Elastic Stack. I have implemented the ELK stack using Docker, and other teams have started feeding their logs to my Logstash. One team in particular is feeding logs through Jenkins using the Logstash plugin (https://wiki.jenkins.io/display/JENKINS/Logstash+Plugin and https://jenkins.io/doc/pipeline/steps/logstash/). I am receiving the data in my Elastic stack and have created separate indices for the Jenkins data, based on host. Below is an example of the resulting indices, followed by a rough sketch of how the data is routed:
curl 'localhost:9200/_cat/indices?v' |grep jenkins*
yellow open jenkins-2018.12.27 9Fnf-9kZRBmJNeVj0V9iLw 1 1 49 0 137.3kb 137.3kb
yellow open jenkins uwCNVIJRQA618bgSL0u7hQ 5 1 147 0 334.2kb 334.2kb
yellow open jenkins-2018.12.26 2zQLqP8fQXq7bTZPh0_wdw 5 1 49 0 98.8kb 98.8kb
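
For context, this is roughly how the routing looks on my side. It is only a sketch: the TCP port, the json_lines codec, and the host-name check are simplified assumptions rather than my exact pipeline.

input {
  tcp {
    port  => 5044            # port the Jenkins Logstash plugin sends to (assumed)
    codec => json_lines
  }
}

output {
  if [host] =~ /jenkins/ {
    # Jenkins events go to their own daily index, matching the listing above
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "jenkins-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash-%{+YYYY.MM.dd}"
    }
  }
}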

Now the issue: the message field of the data received from Jenkins contains over 100K strings (around 13 MB), which makes querying in Kibana slow and causes it to crash.

I would appreciate some suggestions on how to break the message data into smaller chunks so that querying becomes easier. Something along the lines of the filter sketch below is what I have in mind, but I am not sure it is the right approach.
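
This is only a rough sketch of what I am considering, assuming the logstash-filter-truncate and logstash-filter-split plugins are installed; the 65536-byte limit and the newline terminator are placeholder values, not something I have tested:

filter {
  # Cap the size of the message field so a single event cannot grow to 13 MB
  truncate {
    fields       => ["message"]
    length_bytes => 65536        # placeholder limit, would need tuning
  }

  # Or, instead of truncating, split one huge event into many smaller events,
  # one per line of the original message:
  # split {
  #   field      => "message"
  #   terminator => "\n"
  # }
}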

https://discuss.elastic.co/uploads/short-url/qd1ZlVMLMVL2FaqzjG4gWnyNrwm.png

Adding a screen capture of my Kibana error, which is caused by the message field containing this huge amount of data.
