I'm trying to use Elasticsearch in a Java application and then do some machine learning on the data.
But I'm having some trouble. Do you have an example of how to use Elasticsearch with the Java API, for instance to run a search from Java? I searched on Google and tried different things, but without any result.
Thanks if you can help me a little to get started.
Vincent
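For reference, here is a minimal sketch of running a search with the native Elasticsearch Java client (TransportClient). It assumes an Elasticsearch 2.x client dependency, the default cluster name, the transport port 9300 (not the 9200 HTTP port), and the same host and index pattern used in the Spark code further down; adjust all of these to your setup.
import java.net.InetAddress;
import java.util.Map;

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.SearchHit;

// Build a transport client pointing at the cluster.
// "elasticsearch" is the default cluster name (assumption).
Settings settings = Settings.settingsBuilder()
        .put("cluster.name", "elasticsearch")
        .build();
TransportClient client = TransportClient.builder().settings(settings).build()
        .addTransportAddress(new InetSocketTransportAddress(
                InetAddress.getByName("172.26.167.204"), 9300));

// Run a simple match_all search and iterate over the hits.
SearchResponse response = client.prepareSearch("logstash-*")
        .setQuery(QueryBuilders.matchAllQuery())
        .setSize(100)
        .get();
for (SearchHit hit : response.getHits().getHits()) {
    Map<String, Object> source = hit.getSource();
    System.out.println(source);
}

client.close();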
Hi! Thank you, I tried that solution, but in the end I chose Spark, which provides an API for Elasticsearch too.
I'm just putting some Spark code here; maybe it can help someone one day.
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

// Configure Spark and the elasticsearch-hadoop connector.
SparkConf conf = new SparkConf().setAppName("MySparkElas").setMaster("local");
conf.set("es.index.auto.create", "true");
conf.set("es.nodes", "172.26.167.204");
conf.set("es.port", "9200");
JavaSparkContext sc = new JavaSparkContext(conf);

// Read the documents of the index/type into a pair RDD of (id, source map).
String target = "logstash-*/Collector-ticketNEW";
JavaPairRDD<String, Map<String, Object>> esRDD2 = JavaEsSpark.esRDD(sc, target);

// Keep only the document source (the map of fields).
JavaRDD<Map<String, Object>> newRDD = esRDD2.map(x -> x._2());

// Map each document to a Ticket bean.
JavaRDD<Ticket> tickets = newRDD.map(
    new Function<Map<String, Object>, Ticket>() {
        @Override
        public Ticket call(Map<String, Object> o) throws Exception {
            Ticket ticket = new Ticket();
            // Example of mapping, but this one is not very good:
            // splitting toString() is fragile, reading fields with
            // o.get("fieldName") would be more robust.
            String[] parts = o.toString().split(",");
            ticket.setIpAddress(parts[3]);
            return ticket;
        }
    });
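The Ticket class itself isn't shown in the post; Spark SQL's createDataFrame(tickets, Ticket.class) below needs it to be a serializable JavaBean with getters and setters. A minimal sketch of what it could look like (ipAddress is the only field used above, any other field is up to you):
import java.io.Serializable;

// Hypothetical JavaBean used in the mapping above; only ipAddress is set
// in the example.
public class Ticket implements Serializable {
    private String ipAddress;

    public String getIpAddress() {
        return ipAddress;
    }

    public void setIpAddress(String ipAddress) {
        this.ipAddress = ipAddress;
    }
}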
And why not use Spark SQL afterwards:
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

// Apply a schema to an RDD of JavaBeans and register it as a table.
SQLContext sqlContext = new SQLContext(sc);
DataFrame schemaTicket = sqlContext.createDataFrame(tickets, Ticket.class);
schemaTicket.registerTempTable("tickets");
DataFrame df = sqlContext.sql("SELECT * FROM tickets");
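From there the DataFrame can be inspected or queried further. For example (the filter is just an illustration; the ipAddress column comes from the Ticket bean):
// Print the first rows of the result (Spark 1.x DataFrame API).
df.show();

// Query the registered table again, filtering on the bean's column.
DataFrame filtered = sqlContext.sql(
        "SELECT ipAddress FROM tickets WHERE ipAddress IS NOT NULL");
filtered.show();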