hdfs ls command fails with server has invalid Kerberos principal
Article ID: 294704

Updated On:
Products

Services Suite

Issue/Introduction

Symptoms:

The command hdfs dfs -ls fails with the error message "Server has invalid Kerberos principal".

java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/[email protected]; Host Details : local host is: "sdw1.phd.com/10.181.22.130"; destination host is: "hdm1.phd.dev.com":8020;
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.Client.call(Client.java:1300)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
	at com.sun.proxy.$Proxy7.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:688)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy8.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1796)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1116)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1112)
	at org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1701)
	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1647)
	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1622)
	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:224)
	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:207)
	at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:255)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:305)
Caused by: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/[email protected]
	at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:620)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:667)
	at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
	at org.apache.hadoop.ipc.Client.call(Client.java:1318)
	... 28 more
Caused by: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/[email protected]
	at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:325)
	at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:228)
	at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:157)
	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:387)
	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
	at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
	... 31 more

ls: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hdfs/[email protected]; Host Details : local host is: "sdw1.phd.com/10.181.22.130"; destination host is: "hdm1.phd.dev.com":8020;

Environment


Cause

The HDFS client first makes an RPC call to the NameNode to obtain the HDFS service principal. The client then compares the hostname in the service principal against the canonical name of the NameNode host.


In this case, the canonical name of the NameNode as resolved on the client machine differed from the hostname registered in DNS:

java -classpath HadoopDNSVerifier-1.0.jar hadoop.troubleshooting.HadoopDNSVerifier.CheckRemote hdm4.phd.dev.com

IP:10.181.22.149 hostname:hdm4.phd.dev.com canonicalName:hdm1.gphd.local
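The check Hadoop performs can be approximated with plain Java: SaslRpcClient derives the expected server principal from the canonical hostname returned by local name resolution, so a stale /etc/hosts entry changes the result. A minimal sketch (the default hostname argument is illustrative):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class CanonicalNameCheck {
    public static void main(String[] args) throws UnknownHostException {
        // Illustrative default; pass the NameNode host, e.g. hdm1.phd.dev.com
        String host = args.length > 0 ? args[0] : "localhost";
        InetAddress addr = InetAddress.getByName(host);
        // Hadoop substitutes this canonical name into the _HOST placeholder
        // of the expected service principal; if /etc/hosts maps the IP to a
        // different name than DNS, the comparison fails with
        // "Server has invalid Kerberos principal".
        System.out.println("hostname:      " + addr.getHostName());
        System.out.println("canonicalName: " + addr.getCanonicalHostName());
    }
}
```

If canonicalName differs from the hostname the NameNode's principal was created with, the SASL handshake will reject the principal.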

Resolution

There is a bad entry in /etc/hosts for IP address 10.181.22.149, causing Java to return the canonical name hdm1.gphd.local for the NameNode:

10.181.22.149 hdm1.gphd.local

Commenting out this entry in /etc/hosts resolves the issue; rerunning the verifier confirms the canonical name now matches:

java -classpath HadoopDNSVerifier-1.0.jar hadoop.troubleshooting.HadoopDNSVerifier.CheckRemote hdm4.phd.dev.com

IP:10.181.22.149 hostname:hdm4.phd.dev.com canonicalName:hdm4.phd.dev.com

Other notes

  • Use the HadoopDNSVerifier to troubleshoot DNS configuration issues in a Hadoop cluster.
  • Update the HADOOP_OPTS entry in /etc/gphd/hadoop/conf/hadoop-env.sh with the additional parameter sun.security.krb5.debug=true to enable Kerberos debugging:
    export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true ${HADOOP_OPTS}"