NullPointerException when creating HDFS repository because of SSL

Hi All.

I am trying to configure an Elasticsearch HDFS repository against a Kerberized and SSL-protected HDFS cluster.

I can successfully authenticate against Kerberos, but it gives me this error when creating a repository:

[2017-06-06 03:37:19,882][INFO ][org.apache.hadoop.security.UserGroupInformation] Login successful for user elasticsearch/ using keytab file /tmp/elastic.ktb
[2017-06-06 03:37:20,130][ERROR][org.apache.hadoop.hdfs.DFSClient] Failed to close inode 796273
java.io.IOException: DataStreamer Exception:
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:563)
Caused by: java.lang.NullPointerException
at org.apache.hadoop.crypto.CryptoInputStream.&lt;init&gt;(CryptoInputStream.java:133)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:299)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:242)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:211)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1289)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1237)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)
[2017-06-06 03:37:20,131][INFO ][repositories ] [lnx-eus2dw-6892890b.bcidatalake.cl] update repository [diego_hdfs_repository]
[2017-06-06 03:37:20,174][WARN ][org.apache.hadoop.hdfs.DFSClient] DataStreamer Exception
java.lang.NullPointerException
at org.apache.hadoop.crypto.CryptoInputStream.&lt;init&gt;(CryptoInputStream.java:133)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:299)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:242)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:211)
at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1289)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1237)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:449)

Any ideas on how to create/configure an HDFS repository using SSL? Thanks.
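
For context, the NPE is thrown while the SASL data-transfer handshake is building its crypto streams, which suggests the client and the datanodes failed to agree on an encryption cipher, so the CryptoInputStream ends up being constructed with a null codec. If that is the cause, the client side needs the standard Hadoop wire-encryption properties to match the cluster. Below is a minimal sketch of those client settings using plain Hadoop APIs; treat it as an assumption rather than a confirmed fix, and note that the URI, principal, and property values are placeholders that must match your cluster:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

import java.net.URI;

public class EncryptedHdfsClientSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Kerberos authentication (already working according to the log above).
        conf.set("hadoop.security.authentication", "kerberos");

        // Wire encryption for the HDFS data-transfer protocol. Assumption:
        // these must match what the namenode/datanodes are configured with,
        // otherwise cipher negotiation can fail during the SASL handshake.
        conf.set("dfs.encrypt.data.transfer", "true");
        conf.set("dfs.encrypt.data.transfer.cipher.suites", "AES/CTR/NoPadding");

        // Keytab login, mirroring the successful login in the log above.
        // Principal and keytab path are placeholders.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "elasticsearch/host@EXAMPLE.COM", "/tmp/elastic.ktb");

        // Placeholder namenode URI, not taken from the thread.
        FileSystem fs = FileSystem.get(new URI("hdfs://namenode.example.com:8020"), conf);
        System.out.println("root exists: " + fs.exists(new Path("/")));
        fs.close();
    }
}

If I read the plugin docs correctly, the repository-hdfs plugin passes conf.* repository settings straight through into the Hadoop Configuration, so the equivalent would be conf.dfs.encrypt.data.transfer and conf.dfs.encrypt.data.transfer.cipher.suites in the repository settings.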

Might be related to this? https://github.com/elastic/elasticsearch/issues/25022

Hi James. Thanks for the answer. I don't think it is the same case: in the link you provided the error is a no_class_def_found_error, but in my case it is a NullPointerException. I tried putting the missing JAR in the repository plugin folder, without success.

I think it is because I don't know how to configure the SSL certificates for the HDFS repository. Can you shed some light on how I can do it? Thanks for your help. (By the way, the Elasticsearch version I am using is 2.4.3.)
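
One thing worth noting, hedged as my own assumption rather than a confirmed diagnosis: the data path in the trace above is encrypted via SASL (dfs.encrypt.data.transfer), not via JSSE/SSL certificates, so a trust store only comes into play if the cluster exposes actual TLS endpoints (for example HTTPS for WebHDFS). If certificates are needed, the JVM running Elasticsearch must trust the cluster's CA. A small sketch for sanity-checking a JKS trust store before pointing the JVM at it; the path and password are hypothetical placeholders:

import java.io.FileInputStream;
import java.security.KeyStore;
import java.util.Collections;

public class TrustStoreCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical placeholders; substitute your real path and password.
        String path = "/etc/elasticsearch/hdfs-truststore.jks";
        char[] password = "changeit".toCharArray();

        // Load the trust store the same way the JVM would.
        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(path)) {
            ks.load(in, password);
        }

        // List the CA certificates the JVM would trust.
        for (String alias : Collections.list(ks.aliases())) {
            System.out.println("trusted alias: " + alias);
        }
    }
}

If the trust store checks out, it can be handed to an Elasticsearch 2.x JVM via ES_JAVA_OPTS, e.g. -Djavax.net.ssl.trustStore=/etc/elasticsearch/hdfs-truststore.jks (again, a placeholder path).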
