Removed logstash from ELK, now ELK's broken

(Steve) #1

I've set up an ELK server with a number of "beats" agents, versions as follows:

ELK Server

# rpm -qa | grep -E "logstash|elastic"


# rpm -qa | grep beat

A couple of days ago I started getting timeout errors on the agents when they tried to send messages to the ELK server. I read that the problem might lie with Logstash, so I removed Logstash from my pipeline so that the Beats agents send messages directly to Elasticsearch. This seems to have sorted the errors.
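For reference, the change was along these lines in each Beat's config file — the hostnames and port here are placeholders, not the actual server details:

```yaml
# filebeat.yml (same idea in topbeat.yml / packetbeat.yml):
# comment out the Logstash output and enable the Elasticsearch output.
output:
  # logstash:
  #   hosts: ["elk-server:5044"]
  elasticsearch:
    hosts: ["elk-server:9200"]
```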

My question is this: I had some custom grok patterns set up in Logstash for my application logs. How do I replicate this and parse my logs directly in Elasticsearch? Have I made a mess of things by removing Logstash?

Any pointers would be most welcome.

Thanks, Steve.

(Magnus Bäck) #2

ES currently doesn't parse data. You need Logstash for that, but you can of course still send your Topbeat data directly to ES if you prefer.
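To illustrate, the kind of Logstash pipeline that does the parsing looks roughly like this — the port and grok pattern are illustrative placeholders, not Steve's actual configuration:

```
# Sketch of a Beats -> grok -> Elasticsearch pipeline.
# Pattern and port are placeholders for this example.
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```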

(Steve) #3

Hi Magnus,
Thanks for the information (I've been out of the office for a few days, hence the delay in replying). I've checked Kibana, and my Topbeat- and Packetbeat-based dashboards look OK, so unless there's a good reason to change back from ES to LS I will leave them as they are.

It looks like I don't have that option for Filebeat, so I'll have to revert back to LS. If I encounter the same messages again I'll start a new topic.
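For anyone finding this later, pointing Filebeat back at Logstash means re-enabling the Logstash output in filebeat.yml — the hostname and port below are placeholders:

```yaml
# filebeat.yml: send events to Logstash instead of Elasticsearch.
output:
  logstash:
    hosts: ["elk-server:5044"]
```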

Thanks, Steve.
