[2022-10-17T15:26:29,496][DEBUG][o.e.r.h.HdfsRepository ] [bs-dp-aidebugger-dev-001] Adding configuration to HDFS Client Configuration : dfs.replication = 1
[2022-10-17T15:26:29,557][DEBUG][o.a.h.s.Groups ] [bs-dp-aidebugger-dev-001] Creating new Groups object
[2022-10-17T15:26:29,560][DEBUG][o.a.h.u.NativeCodeLoader ] [bs-dp-aidebugger-dev-001] Trying to load the custom-built native-hadoop library...
[2022-10-17T15:26:29,560][DEBUG][o.a.h.u.NativeCodeLoader ] [bs-dp-aidebugger-dev-001] Failed to load native-hadoop with error: java.security.AccessControlException: access denied ("java.lang.RuntimePermission" "loadLibrary.hadoop")
[2022-10-17T15:26:29,560][DEBUG][o.a.h.u.NativeCodeLoader ] [bs-dp-aidebugger-dev-001] java.library.path=/usr/java/packages/lib:/usr/lib64:/lib64:/lib:/usr/lib
[2022-10-17T15:26:29,561][WARN ][o.a.h.u.NativeCodeLoader ] [bs-dp-aidebugger-dev-001] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[2022-10-17T15:26:29,562][DEBUG][o.a.h.u.PerformanceAdvisory] [bs-dp-aidebugger-dev-001] Falling back to shell based
[2022-10-17T15:26:29,564][DEBUG][o.a.h.s.JniBasedUnixGroupsMappingWithFallback] [bs-dp-aidebugger-dev-001] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
[2022-10-17T15:26:29,672][DEBUG][o.a.h.s.Groups ] [bs-dp-aidebugger-dev-001] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
[2022-10-17T15:26:29,672][DEBUG][o.e.r.h.HdfsRepository ] [bs-dp-aidebugger-dev-001] Hadoop security enabled: [false]
[2022-10-17T15:26:29,672][DEBUG][o.e.r.h.HdfsRepository ] [bs-dp-aidebugger-dev-001] Using Hadoop authentication method: [SIMPLE]
[2022-10-17T15:26:29,724][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] Hadoop login
[2022-10-17T15:26:29,725][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] hadoop login commit
[2022-10-17T15:26:29,727][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] Using local user: UnixPrincipal: elasticsearch
[2022-10-17T15:26:29,727][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] Using user: "UnixPrincipal: elasticsearch" with name: elasticsearch
[2022-10-17T15:26:29,727][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] User entry: "elasticsearch"
[2022-10-17T15:26:29,728][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] UGI loginUser: elasticsearch (auth:SIMPLE)
[2022-10-17T15:26:29,742][DEBUG][o.a.h.s.UserGroupInformation] [bs-dp-aidebugger-dev-001] PrivilegedAction [as: elasticsearch (auth:SIMPLE)][action: org.elasticsearch.repositories.hdfs.HdfsRepository$$Lambda$7500/0x0000000801d45150@40ba14a5]
java.lang.Exception: null
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1852) [hadoop-client-api-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.hdfs.HdfsRepository.createBlobstore(HdfsRepository.java:136) [repository-hdfs-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.hdfs.HdfsRepository.lambda$createBlobStore$1(HdfsRepository.java:247) [repository-hdfs-7.17.5.jar:7.17.5]
at java.security.AccessController.doPrivileged(AccessController.java:318) [?:?]
at org.elasticsearch.repositories.hdfs.HdfsRepository.createBlobStore(HdfsRepository.java:246) [repository-hdfs-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.hdfs.HdfsRepository.createBlobStore(HdfsRepository.java:44) [repository-hdfs-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.blobStore(BlobStoreRepository.java:746) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.blobstore.BlobStoreRepository.verify(BlobStoreRepository.java:3217) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.VerifyNodeRepositoryAction.doVerify(VerifyNodeRepositoryAction.java:130) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.VerifyNodeRepositoryAction.access$400(VerifyNodeRepositoryAction.java:37) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.VerifyNodeRepositoryAction$VerifyNodeRepositoryRequestHandler.messageReceived(VerifyNodeRepositoryAction.java:162) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.repositories.VerifyNodeRepositoryAction$VerifyNodeRepositoryRequestHandler.messageReceived(VerifyNodeRepositoryAction.java:157) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler$1.doRun(SecurityServerTransportInterceptor.java:341) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:26) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler$3.onResponse(SecurityServerTransportInterceptor.java:404) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler$3.onResponse(SecurityServerTransportInterceptor.java:394) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.authz.AuthorizationService.authorizeSystemUser(AuthorizationService.java:620) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.authz.AuthorizationService.authorize(AuthorizationService.java:250) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.lambda$inbound$1(ServerTransportFilter.java:136) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:136) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.action.ActionListener$MappedActionListener.onResponse(ActionListener.java:101) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.authc.AuthenticatorChain.authenticateAsync(AuthenticatorChain.java:102) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.authc.AuthenticationService.authenticate(AuthenticationService.java:199) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.inbound(ServerTransportFilter.java:128) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.messageReceived(SecurityServerTransportInterceptor.java:415) [x-pack-security-7.17.5.jar:7.17.5]
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:67) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.transport.InboundHandler$1.doRun(InboundHandler.java:260) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:777) [elasticsearch-7.17.5.jar:7.17.5]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:26) [elasticsearch-7.17.5.jar:7.17.5]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
[2022-10-17T15:26:29,766][DEBUG][o.e.x.s.a.e.ReservedRealm] [bs-dp-aidebugger-dev-001] realm [reserved] authenticated user [elastic], with roles [[superuser]] (cached)
[2022-10-17T15:26:29,766][DEBUG][o.e.x.s.a.RealmsAuthenticator] [bs-dp-aidebugger-dev-001] Authentication of [elastic] using realm [reserved/reserved] with token [UsernamePasswordToken] was [AuthenticationResult{status=SUCCESS, user=User[username=elastic,roles=[superuser],fullName=null,email=null,metadata={_reserved=true}], message=null, exception=null}]
[2022-10-17T15:26:29,924][DEBUG][o.a.h.c.Tracer ] [bs-dp-aidebugger-dev-001] sampler.classes = ; loaded no samplers
[2022-10-17T15:26:29,931][DEBUG][o.a.h.c.Tracer ] [bs-dp-aidebugger-dev-001] span.receiver.classes = ; loaded no span receivers
[2022-10-17T15:26:29,959][DEBUG][o.a.h.h.c.i.DfsClientConf] [bs-dp-aidebugger-dev-001] dfs.client.use.legacy.blockreader.local = false
[2022-10-17T15:26:29,960][DEBUG][o.a.h.h.c.i.DfsClientConf] [bs-dp-aidebugger-dev-001] dfs.client.read.shortcircuit = false
[2022-10-17T15:26:29,960][DEBUG][o.a.h.h.c.i.DfsClientConf] [bs-dp-aidebugger-dev-001] dfs.client.domain.socket.data.traffic = false
[2022-10-17T15:26:29,961][DEBUG][o.a.h.h.c.i.DfsClientConf] [bs-dp-aidebugger-dev-001] dfs.domain.socket.path =
[2022-10-17T15:26:29,981][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
[2022-10-17T15:26:30,008][DEBUG][o.a.h.i.r.RetryUtils ] [bs-dp-aidebugger-dev-001] multipleLinearRandomRetry = null
[2022-10-17T15:26:30,034][DEBUG][o.a.h.i.Server ] [bs-dp-aidebugger-dev-001] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine2$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker@3126f543
[2022-10-17T15:26:30,042][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] getting client out of cache: Client-83f5c34ed6f247bab5bfa3dfcba5ccb0
[2022-10-17T15:26:30,252][DEBUG][o.e.x.s.a.e.ReservedRealm] [bs-dp-aidebugger-dev-001] realm [reserved] authenticated user [elastic], with roles [[superuser]] (cached)
[2022-10-17T15:26:30,253][DEBUG][o.e.x.s.a.RealmsAuthenticator] [bs-dp-aidebugger-dev-001] Authentication of [elastic] using realm [reserved/reserved] with token [UsernamePasswordToken] was [AuthenticationResult{status=SUCCESS, user=User[username=elastic,roles=[superuser],fullName=null,email=null,metadata={_reserved=true}], message=null, exception=null}]
[2022-10-17T15:26:30,254][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11058]: starting
[2022-10-17T15:26:30,254][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] executing initial scroll against [.kibana_task_manager]
[2022-10-17T15:26:30,282][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11058]: got scroll response with [0] hits
[2022-10-17T15:26:30,283][DEBUG][o.e.i.r.WorkerBulkByScrollTaskState] [bs-dp-aidebugger-dev-001] [11058]: preparing bulk request for [0s]
[2022-10-17T15:26:30,283][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11058]: preparing bulk request
[2022-10-17T15:26:30,283][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11058]: finishing without any catastrophic failures
[2022-10-17T15:26:30,284][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] Freed [1] contexts
[2022-10-17T15:26:30,294][DEBUG][o.e.x.s.a.e.ReservedRealm] [bs-dp-aidebugger-dev-001] realm [reserved] authenticated user [elastic], with roles [[superuser]] (cached)
[2022-10-17T15:26:30,295][DEBUG][o.e.x.s.a.RealmsAuthenticator] [bs-dp-aidebugger-dev-001] Authentication of [elastic] using realm [reserved/reserved] with token [UsernamePasswordToken] was [AuthenticationResult{status=SUCCESS, user=User[username=elastic,roles=[superuser],fullName=null,email=null,metadata={_reserved=true}], message=null, exception=null}]
[2022-10-17T15:26:30,296][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11062]: starting
[2022-10-17T15:26:30,296][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] executing initial scroll against [.kibana_task_manager]
[2022-10-17T15:26:30,329][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11062]: got scroll response with [0] hits
[2022-10-17T15:26:30,329][DEBUG][o.e.i.r.WorkerBulkByScrollTaskState] [bs-dp-aidebugger-dev-001] [11062]: preparing bulk request for [0s]
[2022-10-17T15:26:30,330][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11062]: preparing bulk request
[2022-10-17T15:26:30,330][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] [11062]: finishing without any catastrophic failures
[2022-10-17T15:26:30,340][DEBUG][o.e.r.TransportUpdateByQueryAction] [bs-dp-aidebugger-dev-001] Freed [1] contexts
[2022-10-17T15:26:30,554][DEBUG][o.a.h.u.PerformanceAdvisory] [bs-dp-aidebugger-dev-001] Both short-circuit local reads and UNIX domain socket are disabled.
[2022-10-17T15:26:30,565][DEBUG][o.a.h.h.p.d.s.DataTransferSaslUtil] [bs-dp-aidebugger-dev-001] DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
[2022-10-17T15:26:30,571][DEBUG][o.e.r.h.HdfsRepository ] [bs-dp-aidebugger-dev-001] Using file-system [org.apache.hadoop.fs.Hdfs@fc403ee8] for URI [hdfs://10.0.100.7:8020], path [/test20221017]
[2022-10-17T15:26:30,576][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] /test20221017: masked={ masked: rwxr-xr-x, unmasked: rwxrwxrwx }
[2022-10-17T15:26:30,649][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] The ping interval is 60000 ms.
[2022-10-17T15:26:30,663][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] Connecting to /10.0.100.7:8020
[2022-10-17T15:26:30,663][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] Setup connection to /10.0.100.7:8020
[2022-10-17T15:26:30,701][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch: starting, having connections 1
[2022-10-17T15:26:30,708][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #0 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs
[2022-10-17T15:26:30,713][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #0
[2022-10-17T15:26:30,717][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: mkdirs took 105ms
[2022-10-17T15:26:30,732][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] /test20221017/tests-Lbuz2P_xTBypXhiby5MWsw: masked={ masked: rwxr-xr-x, unmasked: rwxrwxrwx }
[2022-10-17T15:26:30,744][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #1 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs
[2022-10-17T15:26:30,748][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #1
[2022-10-17T15:26:30,751][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: mkdirs took 19ms
[2022-10-17T15:26:30,754][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #2 org.apache.hadoop.hdfs.protocol.ClientProtocol.getServerDefaults
[2022-10-17T15:26:30,762][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #2
[2022-10-17T15:26:30,765][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: getServerDefaults took 11ms
[2022-10-17T15:26:30,814][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #3 org.apache.hadoop.hdfs.protocol.ClientProtocol.create
[2022-10-17T15:26:30,824][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #3
[2022-10-17T15:26:30,825][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: create took 12ms
[2022-10-17T15:26:30,862][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] computePacketChunkSize: src=/test20221017/tests-Lbuz2P_xTBypXhiby5MWsw/data-uKRo8rUxSjSDvTcOwUFF_w.dat, chunkSize=516, chunksPerPacket=126, packetSize=65016
[2022-10-17T15:26:30,891][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] WriteChunk allocating new packet seqno=0, src=/test20221017/tests-Lbuz2P_xTBypXhiby5MWsw/data-uKRo8rUxSjSDvTcOwUFF_w.dat, packetSize=65016, chunksPerPacket=126, bytesCurBlock=0, DFSOutputStream:block==null
[2022-10-17T15:26:30,891][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] Queued packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 22, block==null
[2022-10-17T15:26:30,891][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] Queued packet seqno: 1 offsetInBlock: 22 lastPacketInBlock: true lastByteOffsetInBlock: 22, block==null
[2022-10-17T15:26:30,892][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] block==null waiting for ack for: 1
[2022-10-17T15:26:30,903][DEBUG][o.a.h.h.c.i.LeaseRenewer ] [bs-dp-aidebugger-dev-001] Lease renewer daemon for [DFSClient_NONMAPREDUCE_-1487133476_99] with renew id 1 started
[2022-10-17T15:26:30,904][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] stage=PIPELINE_SETUP_CREATE, block==null
[2022-10-17T15:26:30,904][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] Allocating new block: block==null
[2022-10-17T15:26:30,921][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #4 org.apache.hadoop.hdfs.protocol.ClientProtocol.addBlock
[2022-10-17T15:26:30,932][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #4
[2022-10-17T15:26:30,932][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: addBlock took 11ms
[2022-10-17T15:26:30,963][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] pipeline = [DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK]], blk_1073746913_6096
[2022-10-17T15:26:30,967][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] Connecting to datanode 10.0.100.9:9866
[2022-10-17T15:26:30,968][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] Send buf size 256768
[2022-10-17T15:26:30,969][DEBUG][o.a.h.h.p.d.s.SaslDataTransferClient] [bs-dp-aidebugger-dev-001] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[2022-10-17T15:26:30,969][DEBUG][o.a.h.h.p.d.s.SaslDataTransferClient] [bs-dp-aidebugger-dev-001] SASL client skipping handshake in unsecured configuration for addr = /10.0.100.9, datanodeId = DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]
[2022-10-17T15:26:31,043][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] nodes [DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK]] storageTypes [DISK, DISK] storageIDs [DS-931adf5b-56cb-4a43-a127-2711091b6a54, DS-f24718ad-595e-414e-bad7-3db9c9d11c8c]
[2022-10-17T15:26:31,045][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] blk_1073746913_6096 sending packet seqno: 0 offsetInBlock: 0 lastPacketInBlock: false lastByteOffsetInBlock: 22
[2022-10-17T15:26:31,045][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] stage=DATA_STREAMING, blk_1073746913_6096
[2022-10-17T15:26:31,110][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] DFSClient seqno: 0 reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 307670 flag: 0 flag: 0
[2022-10-17T15:26:31,116][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] blk_1073746913_6096 sending packet seqno: 1 offsetInBlock: 22 lastPacketInBlock: true lastByteOffsetInBlock: 22
[2022-10-17T15:26:31,121][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] DFSClient seqno: 1 reply: SUCCESS reply: SUCCESS downstreamAckTimeNanos: 948404 flag: 0 flag: 0
[2022-10-17T15:26:31,121][DEBUG][o.a.h.h.DataStreamer ] [bs-dp-aidebugger-dev-001] Closing old block BP-668373763-10.0.100.7-1662471357966:blk_1073746913_6096
[2022-10-17T15:26:31,127][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #5 org.apache.hadoop.hdfs.protocol.ClientProtocol.complete
[2022-10-17T15:26:31,128][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #5
[2022-10-17T15:26:31,129][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: complete took 2ms
[2022-10-17T15:26:31,135][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch sending #6 org.apache.hadoop.hdfs.protocol.ClientProtocol.getBlockLocations
[2022-10-17T15:26:31,136][DEBUG][o.a.h.i.Client ] [bs-dp-aidebugger-dev-001] IPC Client (1670595564) connection to /10.0.100.7:8020 from elasticsearch got value #6
[2022-10-17T15:26:31,136][DEBUG][o.a.h.i.ProtobufRpcEngine2] [bs-dp-aidebugger-dev-001] Call: getBlockLocations took 2ms
[2022-10-17T15:26:31,145][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] newInfo = LocatedBlocks{; fileLength=22; underConstruction=false; blocks=[LocatedBlock{BP-668373763-10.0.100.7-1662471357966:blk_1073746910_6093; getBlockSize()=22; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK], DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]]; cachedLocs=[]}]; lastLocatedBlock=LocatedBlock{BP-668373763-10.0.100.7-1662471357966:blk_1073746910_6093; getBlockSize()=22; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK], DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]]; cachedLocs=[]}; isLastBlockComplete=true; ecPolicy=null}
[2022-10-17T15:26:31,148][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] Connecting to datanode 10.0.100.7:9866
[2022-10-17T15:26:31,164][DEBUG][o.a.h.h.p.d.s.SaslDataTransferClient] [bs-dp-aidebugger-dev-001] SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
[2022-10-17T15:26:31,165][DEBUG][o.a.h.h.p.d.s.SaslDataTransferClient] [bs-dp-aidebugger-dev-001] SASL client skipping handshake in unsecured configuration for addr = /10.0.100.7, datanodeId = DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK]
[2022-10-17T15:26:31,177][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] DeadNode detection is not enabled or given block LocatedBlocks{; fileLength=22; underConstruction=false; blocks=[LocatedBlock{BP-668373763-10.0.100.7-1662471357966:blk_1073746910_6093; getBlockSize()=22; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK], DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]]; cachedLocs=[]}]; lastLocatedBlock=LocatedBlock{BP-668373763-10.0.100.7-1662471357966:blk_1073746910_6093; getBlockSize()=22; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK], DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]]; cachedLocs=[]}; isLastBlockComplete=true; ecPolicy=null} is null, skip to remove node.
[2022-10-17T15:26:31,178][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] DFSInputStream has been closed already
[2022-10-17T15:26:31,178][DEBUG][o.a.h.h.DFSClient ] [bs-dp-aidebugger-dev-001] DeadNode detection is not enabled or given block LocatedBlocks{; fileLength=22; underConstruction=false; blocks=[LocatedBlock{BP-668373763-10.0.100.7-1662471357966:blk_1073746910_6093; getBlockSize()=22; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK], DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]]; cachedLocs=[]}]; lastLocatedBlock=LocatedBlock{BP-668373763-10.0.100.7-1662471357966:blk_1073746910_6093; getBlockSize()=22; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[10.0.100.7:9866,DS-047a3410-6a5f-4381-b4e1-e931df946ec3,DISK], DatanodeInfoWithStorage[10.0.100.8:9866,DS-f24718ad-595e-414e-bad7-3db9c9d11c8c,DISK], DatanodeInfoWithStorage[10.0.100.9:9866,DS-931adf5b-56cb-4a43-a127-2711091b6a54,DISK]]; cachedLocs=[]}; isLastBlockComplete=true; ecPolicy=null} is null, skip to remove node.
The log above was produced when I registered the repository with this command:
PUT _snapshot/test
{
  "type" : "hdfs",
  "settings" : {
    "path" : "/test20221017",
    "conf" : {
      "dfs" : {
        "replication" : 1
      }
    },
    "compress" : "true",
    "uri" : "hdfs://10.0.100.7:8020"
  }
}
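As far as I can tell, the java.lang.Exception: null trace in the log is not itself a failure: at DEBUG level Hadoop's UserGroupInformation logs a synthetic Exception purely to record the doAs call stack, and the frames show it is the normal repository verification (VerifyNodeRepositoryAction / BlobStoreRepository.verify) that runs automatically on registration. To reproduce just that step in isolation, the verification can be re-run by hand with the standard verify-repository API (assuming the repository name test from the PUT above):

POST _snapshot/test/_verify

Conversely, registering with PUT _snapshot/test?verify=false should skip that verification round trip entirely.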
The /test20221017 path in HDFS did not exist before I created the repository. Running fsck against it afterwards:
hdfs fsck /test20221017
Connecting to namenode via http://node4:9870/fsck?ugi=elasticsearch&path=%2Ftest20221017
FSCK started by elasticsearch (auth:SIMPLE) from /10.0.100.7 for path /test20221017 at Mon Oct 17 15:43:07 CST 2022
Status: HEALTHY
Number of data-nodes: 3
Number of racks: 1
Total dirs: 1
Total symlinks: 0
Replicated Blocks:
Total size: 0 B
Total files: 0
Total blocks (validated): 0
Minimally replicated blocks: 0
Over-replicated blocks: 0
Under-replicated blocks: 0
Mis-replicated blocks: 0
Default replication factor: 3
Average block replication: 0.0
Missing blocks: 0
Corrupt blocks: 0
Missing replicas: 0
Blocks queued for replication: 0
Erasure Coded Block Groups:
Total size: 0 B
Total files: 0
Total block groups (validated): 0
Minimally erasure-coded block groups: 0
Over-erasure-coded block groups: 0
Under-erasure-coded block groups: 0
Unsatisfactory placement block groups: 0
Average block group size: 0.0
Missing block groups: 0
Corrupt block groups: 0
Missing internal blocks: 0
Blocks queued for replication: 0
FSCK ended at Mon Oct 17 15:43:07 CST 2022 in 2 milliseconds
The filesystem under path '/test20221017' is HEALTHY
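The zero file count from fsck would be consistent with the verification cleaning up its temporary test blobs (the tests-Lbuz2P_xTBypXhiby5MWsw directory and data-uKRo8rUxSjSDvTcOwUFF_w.dat file visible in the DataStreamer lines) after a successful round trip. To double-check what, if anything, was left behind under the repository path, a plain recursive listing works (run as the same elasticsearch user):

hdfs dfs -ls -R /test20221017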