I have an open question.
I'm a student working on what you could call an anomaly-detection project.
I chose to store my data in Elasticsearch, and I currently have 5–6 weeks of data ready.
But now I have a big problem: how do I analyze this data?
My problem is about detecting anomalies in time, e.g. something happening during the night...
My idea is to learn the time patterns of my logs, so maybe machine learning is the best fit for my case (I've never used it).
Googling turned up too many possibilities:
there's Spark with MLlib, which provides machine learning,
there's Hadoop, DeepDetect, etc., etc.
If you have any experience with this kind of problem, could you give me some advice?
Which solution should I choose?
I tried to use the Java API of Elasticsearch, but so far I've failed.
Thanks to anyone who can help me a little.