Hey, I have imported many log files into Kibana and they all share the same index pattern.
What I need to do is create visualizations by correlating this data the way database tables work
(primary keys...), meaning the info in the first file depends on the second log file. Any idea how I can do this
without using Elastic Graph? I need to create dashboards, pie charts, and bar chart visualizations.
Elasticsearch is not a database, it's a document store. Generally speaking, the way to work with document stores is to denormalize your data as much as possible, usually by replicating data that you would normally group up into different tables in a relational database.
The short answer is, you can't do what you want. The long answer is, tell me about your data and what you are trying to do with it and maybe I can help guide you to a better way to index it into Elasticsearch.
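To make the denormalization idea concrete, a single document could carry both the application info and the backup-server details together. A minimal sketch, using the field names from the question (the values and the `volume_gb` field name are hypothetical):

```json
{
  "Parent_Nom": "my-application",
  "Enfant_Nom": "srv-web-01",
  "Node": "srv-web-01",
  "status": "active",
  "volume_gb": 120
}
```

With every document carrying both sides of the relationship, Kibana aggregations (pie charts, bar charts, etc.) can slice by any combination of those fields without needing a join.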
I have two large CSV files, as shown below: one for the database (CMDB) and one for a backup server. After filtering the CMDB file I need to specify the name of the application (Parent_Nom), which gives me the name of the server where it is installed (Enfant_Nom). After that I go to the tsm-server file, look up the name of the server under (Node), and check its status, volume...
Is it possible to visualize this in Kibana?
Can a Lucene or DSL query do that, or should I define a method in my Logstash config file?
It is like creating a foreign key to build relations between database tables.
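One common way to approximate that foreign-key lookup is to enrich events at index time in Logstash with the `elasticsearch` filter plugin, rather than joining at query time. A minimal sketch, assuming the CMDB CSV has already been indexed into an index named `cmdb` (the hosts and index name are assumptions; `Node`, `Enfant_Nom`, and `Parent_Nom` are the field names from the description above):

```conf
filter {
  # Look up the CMDB document whose Enfant_Nom matches this event's Node,
  # and copy its Parent_Nom into a new "application" field on the event.
  elasticsearch {
    hosts  => ["localhost:9200"]      # assumed Elasticsearch address
    index  => "cmdb"                  # assumed index holding the CMDB data
    query  => "Enfant_Nom:%{[Node]}"
    fields => { "Parent_Nom" => "application" }
  }
}
```

After this enrichment, each tsm-server event carries its application name, so Kibana dashboards can aggregate on it directly with no join needed.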