I'm considering doing log analysis with td-agent and ELK. I saw a page linked from td-agent that says MongoDB is commonly used for app logs, while Elasticsearch + Kibana is commonly used for text logs.
I'm new to both MongoDB and ELK. Could you give me a general idea of the pros and cons of MongoDB vs. Elasticsearch,
especially the disadvantages of ELK for app log analysis?
Thank you in advance.
I followed the link you had there. I think the only perceivable difference between "app" logs and so-called "text" logs is how they've been inserted into Elasticsearch. If you do no tokenization of any kind, then yes, it will be a plain text search. However, if you use a log parser like Logstash (or even Treasure Data's fluentd), your "text" logs become structured logs, which can then be stored "schema-less," making them no different from the so-called "app" logs.
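For example, a minimal Logstash filter that tokenizes an Apache-style access log line into named fields might look like this (the exact patterns and date format are illustrative; adjust them to your actual log format):

```
filter {
  grok {
    # Parse a combined Apache access log line into structured fields
    # (clientip, verb, request, response, bytes, etc.)
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # Use the timestamp parsed from the log line as the event time
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```

After this filter runs, each event arrives in Elasticsearch as a structured document rather than one opaque `message` string.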
This is the default behavior of the Elastic stack (we don't call it ELK any more, because there are more parts than just Elasticsearch, Logstash, and Kibana). Use Beats and/or Logstash to ingest logs and tokenize them. Elasticsearch even has what's known as an Ingest Node now, which can do some limited parsing and tokenizing of logs itself. As such, Elasticsearch is on equal footing with the other "schema-less backends" while still maintaining all of the benefits of a full-text search engine.
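As a sketch of the Ingest Node approach (the pipeline name and grok pattern here are hypothetical, not from this thread), you could define a pipeline with a grok processor and reference it when indexing:

```
PUT _ingest/pipeline/parse-syslog
{
  "description": "Tokenize raw syslog lines into structured fields",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{SYSLOGTIMESTAMP:ts} %{SYSLOGHOST:host} %{GREEDYDATA:msg}"]
      }
    }
  ]
}
```

Then index through it with `PUT my-index/_doc/1?pipeline=parse-syslog`, and the `message` field is split into `ts`, `host`, and `msg` before the document is stored.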
Disadvantage? I don't see one.
Thank you Aaron.

> As such, Elasticsearch is on equal footing with the other "schema-less backends" while still maintaining all of the benefits of a full-text search engine.
That's what I'm thinking about. Tokenize or not.
Always tokenize if you can. Logstash or an ingest node pipeline are terrific choices for this.