HDFS – Listing / Viewing Files when Kerberos Installed

When using a Hadoop HDFS cluster with Kerberos enabled, you need to provide credentials before you can list or view files from the command line.

In a non-Kerberos environment, the “hadoop” or “hdfs” user can run the following command without any authentication:

bash:localhost:> hadoop fs -ls /user

With Kerberos, however, you have to authenticate first:

  • Do a kinit using a principal that has access to Hadoop. Assume the principal is “hdp@REALM.COM“.
  • kinit -kt /path/to/hdp.keytab hdp@REALM.COM -> if you are logging in with a keytab (the -kt option takes the keytab path followed by the principal)
  • kinit hdp@REALM.COM -> if you are logging in with the principal’s password.
  • Run ls to list files, or cat to view file contents:
  • hdfs dfs -ls /user/hdp (or any other location within HDFS)
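The steps above can be sketched as a small shell script. This is a dry-run sketch, not a definitive implementation: the principal is the `hdp@REALM.COM` placeholder from the steps, and the keytab path is an assumption — substitute your own. The commands are built into variables and echoed so the flow can be inspected without a live cluster.

```shell
#!/bin/sh
# Sketch of the Kerberos login + HDFS listing flow described above.
# PRINCIPAL and KEYTAB are placeholders -- substitute your own values.
PRINCIPAL="hdp@REALM.COM"
KEYTAB="/etc/security/keytabs/hdp.keytab"   # assumed keytab location

# Build the commands first so the flow is visible as a dry run;
# on a real cluster you would execute them instead of echoing.
kinit_cmd="kinit -kt $KEYTAB $PRINCIPAL"
ls_cmd="hdfs dfs -ls /user/hdp"

echo "$kinit_cmd"
echo "$ls_cmd"

# On a cluster with Kerberos client tools installed, run:
#   $kinit_cmd && $ls_cmd
```

If the ticket is obtained successfully, `klist` will show it, and any subsequent `hdfs dfs` or `hadoop fs` command in the same session reuses the cached ticket.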

This may also work:

kinit -kt `whoami`.keytab `whoami`@INTRANET.SERVER.COM

Place the keytab file in the user’s home folder and add the line above at the start of all scripts; the scripts can then run “hadoop fs -ls” etc. without prompting for a password.
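A hedged sketch of that script preamble, using the `whoami`-based keytab naming and the `INTRANET.SERVER.COM` realm from the snippet above. The keytab location (the user’s home directory) is an assumption, and the `kinit` call is guarded so the script degrades cleanly on hosts without Kerberos client tools or without the keytab in place:

```shell
#!/bin/sh
# Preamble to put at the top of batch scripts so they authenticate
# before touching HDFS. Keytab path is an assumed convention:
# <home dir>/<username>.keytab, as in the kinit one-liner above.
USER_NAME=$(whoami)
KEYTAB="$HOME/${USER_NAME}.keytab"
PRINCIPAL="${USER_NAME}@INTRANET.SERVER.COM"

# Only attempt kinit when both the client tools and the keytab exist,
# so the preamble is harmless on non-Kerberos hosts.
if command -v kinit >/dev/null 2>&1 && [ -f "$KEYTAB" ]; then
    kinit -kt "$KEYTAB" "$PRINCIPAL" || exit 1
fi

echo "principal=$PRINCIPAL"
```

With this at the top, the rest of the script can call `hadoop fs` / `hdfs dfs` commands as usual, relying on the freshly obtained ticket.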

References

https://community.hortonworks.com/questions/52615/how-to-view-hdfs-files-when-kerberos-is-installed.html

https://stackoverflow.com/questions/31020669/authenticate-scripts-on-hdfs-using-key-tab-file
