Log in to your remote cluster (here remote-clusters) and run ktutil. In this example the username is hoa and the realm is SAIKOCAT.COM.
[hoa@remote-clusters ~]$ ktutil
ktutil: addent -password -p hoa@SAIKOCAT.COM -k 1 -e aes256-cts
Password for hoa@SAIKOCAT.COM:
ktutil: addent -password -p hoa@SAIKOCAT.COM -k 1 -e rc4-hmac
Password for hoa@SAIKOCAT.COM:
ktutil: wkt hoa.keytab
ktutil: quit
[hoa@remote-clusters ~]$ chmod 0700 hoa.keytab
Download the newly created keytab to your local machine
[hoa@local ~]$ scp hoa@remote-clusters:~/hoa.keytab ~/
Set up the Kerberos client packages (krb5-workstation, krb5-user, etc., depending on your distribution).
See [krb5.conf] file.
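The [krb5.conf] referenced above usually only needs the default realm and the KDC address. A minimal sketch, assuming the KDC is reachable at kdc.saikocat.com (a hypothetical hostname; use whatever your administrator gives you):

```
# Minimal krb5.conf sketch; kdc.saikocat.com is a hypothetical KDC hostname
[libdefaults]
    default_realm = SAIKOCAT.COM
    ticket_lifetime = 24h
    renew_lifetime = 7d

[realms]
    SAIKOCAT.COM = {
        kdc = kdc.saikocat.com
        admin_server = kdc.saikocat.com
    }

[domain_realm]
    .saikocat.com = SAIKOCAT.COM
    saikocat.com = SAIKOCAT.COM
```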
Obtain the Hadoop client configuration files from your cluster administrator and save them to, say, ~/hadoop-conf/.
See [Remote-Hadoop-cluster-conf-files.out] file.
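Among those files, core-site.xml is the one that switches the client into Kerberos mode; the relevant fragment looks roughly like this (the NameNode hostname is a placeholder, not from the original config):

```
<!-- Fragment of core-site.xml; namenode.saikocat.com is a placeholder -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode.saikocat.com:8020</value>
</property>
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
```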
Authenticate yourself with the cluster (also needed again once your ticket's ticket_lifetime has expired)
[hoa@local ~]$ kinit hoa@SAIKOCAT.COM -k -t ~/hoa.keytab
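Because kinit reads the credentials from the keytab rather than prompting, it can run unattended; a crontab entry like the following (paths assumed from the examples above) keeps the ticket fresh. You can inspect the current ticket at any time with klist.

```
# Hypothetical crontab entry: refresh the Kerberos ticket every 8 hours
0 */8 * * * /usr/bin/kinit hoa@SAIKOCAT.COM -k -t /home/hoa/hoa.keytab
```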
Then, once Pig is set up, you can submit jobs remotely
[hoa@local ~]$ export HADOOP_CONF_DIR="/home/hoa/hadoop-conf/"
[hoa@local ~]$ pig -P ./pig.properties remote.pig
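If pig cannot reach the cluster, the usual culprits are a missing configuration directory or an expired ticket. A small pre-flight sketch (the ~/hadoop-conf path and file names follow the examples above and are assumptions):

```shell
# Pre-flight sketch: confirm the Hadoop client configuration is present
# before submitting. Paths follow the examples above and are assumptions.
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-$HOME/hadoop-conf}"
if [ -f "$HADOOP_CONF_DIR/core-site.xml" ]; then
    conf_state=ok
else
    conf_state=missing
fi
echo "Hadoop conf under $HADOOP_CONF_DIR: $conf_state"
# With conf_state=ok (and a valid ticket from kinit), submit as usual:
#   pig -P ./pig.properties remote.pig
```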
Cheers