Elasticsearch, NetBeans, Maven and Java

G'Day.

I'm trying to write a Java program that interacts with Elasticsearch, and the provided documentation on the Java API has been less than helpful - see https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/_maven_repository.html

This has proven to be far more trouble than it should have been, so a little feedback and thanks to the various sites I reviewed trying to work out how to get it going...

The setup is quite simple - a new Windows 10 box with NetBeans and Java freshly installed.

I followed the instructions with NetBeans, and the first couple of attempts to set it up failed. The documentation seems to assume quite a bit of knowledge about both Maven and NetBeans. As I was also using this as a chance to learn NetBeans, that knowledge was somewhat lacking on my part...

So what did work:

Create a new NetBeans project - a Maven project for a Java application (the jury's still out on whether a plain Java application is ok or whether it has to be an Enterprise Java application).

Edit the pom.xml and add some dependencies inside the defined project:

<dependencies>
    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch</artifactId>
        <version>5.5.2</version>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>org.elasticsearch.client</groupId>
        <artifactId>transport</artifactId>
        <version>5.5.2</version>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.8.2</version>
    </dependency>
</dependencies>

As I get further in, there may be more dependencies that are required. I had hoped the blanket elasticsearch dependency would pull all the others in, but apparently not. :disappointed_relieved:

Build - it should go off and download the required artifacts from Maven. This will pull in the standard Apache log4j logger.

To create the log4j2.properties file, you need to use 'Source Packages' -> 'New' -> 'Folder' and tell it to make one called resources under src/main (the default offered is src/main/java). Once you've done that, use 'Resources' -> 'New' -> 'Properties File' to create the actual file. So far the cut-and-paste example from the doco seems to work ok. If you get this wrong, it complains about not being able to find a log4j2 configuration file - the trick is that the resources directory has to be on the classpath, which is achieved by adding it as a source folder rather than simply creating the directory on disk.
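For reference, the minimal log4j2.properties from the Elasticsearch docs that I pasted in looks like this (a sketch - your log levels and appenders may differ):

```properties
# Console appender - prints log output to stdout
appender.console.type = Console
appender.console.name = console
appender.console.layout.type = PatternLayout

# Root logger at info, wired to the console appender
rootLogger.level = info
rootLogger.appenderRef.console.ref = console
```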

Now, some code. Create a new Java class and give it a main method. It should build but do nothing at this point.
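For completeness, the empty starting point looks like this (Phant1 is just my class name):

```java
// Minimal starting class - builds and runs, does nothing yet.
public class Phant1 {
    public static void main(String[] args) {
        // Elasticsearch code goes in here later.
    }
}
```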

I pasted the following in line by line (more or less - the try/catch has to go in first) and let NetBeans add dependencies as it saw fit:

// Imports NetBeans added for me:
import java.net.InetAddress;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.transport.client.PreBuiltTransportClient;

try {
    InetSocketTransportAddress cluster = new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300);
    System.out.println("Address is :" + cluster);
    TransportClient client = new PreBuiltTransportClient(Settings.EMPTY);
    System.out.println("Client is :" + client);
} catch (Exception e) {
    e.printStackTrace();
    Logger.getLogger(Phant1.class.getName()).log(Level.SEVERE, null, e);
}

...and it actually builds and runs. Which is more than my first 3 or 4 attempts at creating the project did. Phant1 in the above is my class name.

Ok, I forgot to link the address to the client in the above. Then I couldn't get it working with the default cluster name. Eventually I ended up with:

try {
    // Values from my setup, declared here for completeness
    // (the user/message values are just placeholders):
    String cluster_name = "mik-phant";   // must match cluster.name in elasticsearch.yml
    String index_name = "phant-test1";
    String type_name = "test1";
    String jstring = "{\"user\":\"mik\",\"message\":\"trying out Elasticsearch\",\"postDate\":" + new Date().getTime() + "}";

    // Set up for connectivity to elastic
    InetSocketTransportAddress cluster = new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300);
    System.out.println("Address is :" + cluster);

    Settings my_settings = Settings.builder()
            .put("cluster.name", cluster_name)
            .build();

    System.out.println("Cluster name is:" + cluster_name);
    System.out.println("Cluster name is:" + my_settings.get("cluster.name", "blank"));

    TransportClient client = new PreBuiltTransportClient(my_settings);
    System.out.println("Client is :" + client);

    // Tell the client where the cluster is:
    client.addTransportAddress(cluster);

    // Send to Elastic
    IndexResponse response = client.prepareIndex(index_name, type_name)
        .setSource(jstring)
        .get();

    System.out.println(response);

    client.close();
} catch (Exception e) {
    e.printStackTrace();
    Logger.getLogger(Phant1.class.getName()).log(Level.SEVERE, null, e);
}

...and I had to edit Elasticsearch's config/elasticsearch.yml file to specify both a cluster name and a node name.
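For the record, the relevant lines in config/elasticsearch.yml looked something like this (mik-phant is the cluster name from my setup; the node name is arbitrary):

```yaml
# Must match the cluster.name the TransportClient is configured with
cluster.name: mik-phant
# Any name will do, it just has to be set
node.name: node-1
```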

Can I just say that the error message you get when they're wrong isn't very friendly:

Client is :org.elasticsearch.transport.client.PreBuiltTransportClient@5767b2af
node {#transport#-1}{WIGQWUG8TcqrlCa2MkWmBw}{localhost}{127.0.0.1:9300} not part of the cluster Cluster [mik-phant], ignoring...

So now it'll post a record into an index.

Next catch was that Kibana/Elasticsearch wasn't recognising the postDate field as a date - at this point I was still working from Elastic's twitter example.

I changed the value of the postDate field to be new Date().getTime() (epoch milliseconds), but Elasticsearch/Kibana wouldn't recognise it as a date until I added a mapping for it:

PUT phant-test1 
{
    "settings" : {
        "number_of_shards" : 2
    },
    "mappings" : {
        "test1" : {
            "properties" : {
                "user" : { "type" : "text" },
                "message" : { "type" : "text" },
                "postDate" : { "type" : "date" }
            }
        }
    }
}

This needs pasting into the Kibana Dev Tools Console. The response to a successful run is a simple acknowledgement. Surprisingly, asking for the console version from the manual launched my local Kibana console with the text loaded.
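On the Java side, here's a minimal sketch of building the jstring source so postDate carries epoch milliseconds (field names match the mapping above; the DocBuilder class and the sample values are just my illustration):

```java
import java.util.Date;

public class DocBuilder {
    // Build a JSON source string matching the phant-test1/test1 mapping.
    // postDate is epoch milliseconds, which the "date" mapping accepts.
    static String buildDoc(String user, String message, long postDateMillis) {
        return "{"
                + "\"user\":\"" + user + "\","
                + "\"message\":\"" + message + "\","
                + "\"postDate\":" + postDateMillis
                + "}";
    }

    public static void main(String[] args) {
        String jstring = buildDoc("mik", "hello elastic", new Date().getTime());
        System.out.println(jstring);
    }
}
```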

The 'copy cURL' version from the manual didn't work. After pasting into Notepad and editing, it was spread over multiple lines, and even when stuck on one line it objected to some of the ' characters and the Content-Type header. (Windows 10, latest cURL.)
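From a Windows cmd prompt, the usual workaround is to swap the single quotes for escaped double quotes - something along these lines (a sketch with placeholder values, not tested on other shells):

```
curl -XPUT "http://localhost:9200/phant-test1/test1/1" -H "Content-Type: application/json" -d "{\"user\":\"mik\",\"message\":\"hello\",\"postDate\":1504000000000}"
```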

Interesting that the Kibana Dev Tools panel works even without any index patterns set up.

Before I could get the mapping to work, I had to delete the index -

curl -XDELETE http://localhost:9200/phant-test1

May I recommend that you don't invest too much time in the Transport Client, as it will be deprecated soon?

You can give the Rest Client a try, or even better, the High Level Rest Client.

I have a full version working here:

HTH

Thanks for the heads up. Right now I just need something that works so I can focus on the application development side. I've already spent way longer on basic comms than I wanted to.

This commit uses the TransportClient instead: https://github.com/dadoonet/legacy-search/tree/0ce4571607ae97be217321f59c53d382ef85dc9d

Maybe this can help you.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.