While trying to insert a document into Elasticsearch, this is the code I used. The code executes fine, but doesn't insert anything. When I set a debug point to see what went wrong, I get this error: Method threw 'java.lang.StackOverflowError' exception. Cannot evaluate org.elasticsearch.common.inject.InjectorImpl.toString().
Please note that I only see this error while debugging; otherwise, the code runs without complaint.
Hi,
You need to pass the settings, especially cluster.name, to the constructor when creating the TransportClient; otherwise you get a NoNodeAvailableException. Rather than using the debugger, I'd suggest looking at the exceptions that are thrown (either printing them or using some sort of logging). Also, checking response.isCreated() in this particular case will tell you whether indexing the document was successful.
Cheers
I did try that now, and my code looks like this. But I still get the same error. Can you tell me if I'm missing something? I attached a screenshot of the error.
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import java.util.Date;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;

public class TestInsert {
    public static void main(String[] args) {
        // Host name only -- no "http:" prefix; the transport client doesn't speak HTTP.
        String hostname = "localhost";
        Settings settings = ImmutableSettings.settingsBuilder()
                .put("cluster.name", "my-cluster-name")
                .build();
        try {
            Client client = new TransportClient(settings)
                    .addTransportAddress(new InetSocketTransportAddress(hostname, 9300));
            IndexResponse response = client.prepareIndex("twitter", "tweet", "1")
                    .setSource(jsonBuilder()
                            .startObject()
                            .field("user", "kimchy")
                            .field("postDate", new Date())
                            .field("message", "trying out Elasticsearch")
                            .endObject())
                    .execute()
                    .actionGet();
            System.out.println("Is the index created?");
            System.out.println(response.isCreated());
        } catch (Exception e) {
            // Don't swallow the exception -- this is where a NoNodeAvailableException would surface.
            e.printStackTrace();
        }
    }
}
The injector you are seeing in your debugger has nothing to do with inserting a document into Elasticsearch. It seems it doesn't like to get printed; calling toString() on it apparently creates an infinite loop, but that's an issue with your IDE, not your code.
I used "_cluster/health?pretty=true" to see the JSON response and picked up "cluster_name" from it. I still don't understand why I end up with a NoNodeAvailableException.
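Since a matching cluster.name is what the transport client keys on, it can help to double-check the value read from _cluster/health programmatically rather than by eye. Below is a minimal sketch in plain Java (no JSON library; the sample response string and the class name ClusterNameCheck are illustrative, not from a real cluster) that pulls cluster_name out of a health response body:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ClusterNameCheck {
    // Matches the "cluster_name" field in a _cluster/health JSON body.
    static final Pattern CLUSTER_NAME =
            Pattern.compile("\"cluster_name\"\\s*:\\s*\"([^\"]+)\"");

    // Returns the cluster_name value, or null if the field is absent.
    static String extractClusterName(String healthJson) {
        Matcher m = CLUSTER_NAME.matcher(healthJson);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        // Illustrative sample of a _cluster/health response.
        String sample = "{ \"cluster_name\" : \"my-cluster-name\", \"status\" : \"green\" }";
        System.out.println(extractClusterName(sample)); // prints: my-cluster-name
    }
}
```

The value printed here must match the string passed to put("cluster.name", ...) exactly, including case, or the transport client will reject every node it discovers.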