Last active
December 19, 2018 19:12
Verified the Hadoop version:

13:07:13 $ hadoop version
Hadoop 2.6.0-cdh5.14.0
Subversion http://github.com/cloudera/hadoop -r 9b197d35839383c798c618ba917ccaa196a17699
Compiled by jenkins on 2018-01-06T21:36Z
Compiled with protoc 2.5.0
From source with checksum f4ddee45985a34faa91db2aad8731f
This command was run using /opt/cloudera/parcels/CDH-5.14.0-1.cdh5.14.0.p2869.3125/jars/hadoop-common-2.6.0-cdh5.14.0.jar
Flink version being used: flink-1.5.4-bin-hadoop26-scala_2.11, downloaded from https://archive.apache.org/dist/flink/flink-1.5.4/flink-1.5.4-bin-hadoop26-scala_2.11.tgz
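For reproducibility, that distribution can be fetched and unpacked with standard curl/tar (the commands themselves are not from the original report, only the URL is):

```shell
# Download the Flink 1.5.4 build for Hadoop 2.6 / Scala 2.11
curl -LO https://archive.apache.org/dist/flink/flink-1.5.4/flink-1.5.4-bin-hadoop26-scala_2.11.tgz
# Unpack into the home directory; the tarball extracts to flink-1.5.4,
# matching the ~/flink-1.5.4 path used in the commands below
tar -xzf flink-1.5.4-bin-hadoop26-scala_2.11.tgz -C ~
```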
Ran kinit successfully on the node, then verified that the Kerberos ticket cache was created:

13:06:37 $ klist
Ticket cache: FILE:/tmp/krb5cc_17049136_9qVij9
Default principal: SI022833@NORTHAMERICA.CERNER.NET

Valid starting     Expires            Service principal
12/19/18 13:06:26  12/19/18 23:06:28  krbtgt/NORTHAMERICA.CERNER.NET@NORTHAMERICA.CERNER.NET
	renew until 12/26/18 13:06:26
Afterwards, I ran the following command:

~/flink-1.5.4/bin/yarn-session.sh -n 5 -tm 2048 -s 4 -d -nm flink_yarn
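For context, here is what each flag in that command means (these are the standard Flink 1.5.x YARN session client options, spelled out here for readers; the explanations are not part of the original report):

```shell
# Flags for yarn-session.sh (Flink 1.5.x YARN session client):
#   -n 5            allocate 5 YARN containers (one TaskManager each)
#   -tm 2048        2048 MB of memory per TaskManager container
#   -s 4            4 task slots per TaskManager (20 slots total)
#   -d              detached mode: the client returns once the session is up
#   -nm flink_yarn  application name shown in the YARN ResourceManager UI
~/flink-1.5.4/bin/yarn-session.sh -n 5 -tm 2048 -s 4 -d -nm flink_yarn
```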
Resulting logs below:
2018-12-19 13:06:31,008 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.rpc.address, localhost
2018-12-19 13:06:31,010 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.rpc.port, 6123
2018-12-19 13:06:31,010 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: jobmanager.heap.mb, 1024
2018-12-19 13:06:31,010 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: taskmanager.heap.mb, 1024
2018-12-19 13:06:31,010 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2018-12-19 13:06:31,010 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: parallelism.default, 1
2018-12-19 13:06:31,011 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: rest.port, 8081
2018-12-19 13:06:31,012 INFO  org.apache.flink.configuration.GlobalConfiguration - Loading configuration property: security.kerberos.login.use-ticket-cache, true
2018-12-19 13:06:31,582 WARN  org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-12-19 13:06:31,852 INFO  org.apache.flink.runtime.security.modules.HadoopModule - Hadoop user set to si022833 (auth:SIMPLE)
2018-12-19 13:06:32,101 INFO  org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at /0.0.0.0:8032
2018-12-19 13:06:33,555 INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2018-12-19 13:06:34,557 INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2018-12-19 13:06:35,559 INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2018-12-19 13:06:36,561 INFO  org.apache.hadoop.ipc.Client - Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
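A likely cause, judging from the logs above (this is a diagnosis I am adding, not something confirmed in the original report): both "Hadoop user set to si022833 (auth:SIMPLE)" and the connection attempts to the default ResourceManager address 0.0.0.0:8032 suggest the YARN client never loaded the cluster's core-site.xml and yarn-site.xml. Exporting the Hadoop configuration directory before starting the session typically resolves both symptoms:

```shell
# Point the Flink YARN client at the cluster configuration.
# /etc/hadoop/conf is the usual CDH location, but that path is an
# assumption here; adjust it for your cluster.
export HADOOP_CONF_DIR=/etc/hadoop/conf
~/flink-1.5.4/bin/yarn-session.sh -n 5 -tm 2048 -s 4 -d -nm flink_yarn
```

With HADOOP_CONF_DIR set, yarn-site.xml supplies the real ResourceManager address instead of the 0.0.0.0:8032 default, and core-site.xml should switch the login module from auth:SIMPLE to auth:KERBEROS so the ticket cache is actually used.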