
@dedunumax
Created May 21, 2015 10:04
15/05/21 09:48:09 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
15/05/21 09:48:10 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
15/05/21 09:48:10 INFO input.FileInputFormat: Total input paths to process : 1
15/05/21 09:48:10 INFO input.FileInputFormat: Total input paths to process : 1
15/05/21 09:48:10 INFO input.FileInputFormat: Total input paths to process : 1
15/05/21 09:48:10 INFO mapreduce.JobSubmitter: number of splits:3
15/05/21 09:48:10 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1432197111554_0002
15/05/21 09:48:10 INFO impl.YarnClientImpl: Submitted application application_1432197111554_0002
15/05/21 09:48:10 INFO mapreduce.Job: The url to track the job: http://hdp101.local:8088/proxy/application_1432197111554_0002/
15/05/21 09:48:10 INFO mapreduce.Job: Running job: job_1432197111554_0002
15/05/21 09:48:17 INFO mapreduce.Job: Job job_1432197111554_0002 running in uber mode : false
15/05/21 09:48:17 INFO mapreduce.Job: map 0% reduce 0%
15/05/21 09:48:26 INFO mapreduce.Job: map 33% reduce 0%
15/05/21 09:48:27 INFO mapreduce.Job: map 100% reduce 0%
15/05/21 09:48:32 INFO mapreduce.Job: map 100% reduce 100%
15/05/21 09:48:33 INFO mapreduce.Job: Job job_1432197111554_0002 completed successfully
15/05/21 09:48:33 INFO mapreduce.Job: Counters: 49
	File System Counters
		FILE: Number of bytes read=276
		FILE: Number of bytes written=425877
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=1457
		HDFS: Number of bytes written=79
		HDFS: Number of read operations=12
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters
		Launched map tasks=3
		Launched reduce tasks=1
		Data-local map tasks=3
		Total time spent by all maps in occupied slots (ms)=22080
		Total time spent by all reduces in occupied slots (ms)=3382
		Total time spent by all map tasks (ms)=22080
		Total time spent by all reduce tasks (ms)=3382
		Total vcore-seconds taken by all map tasks=22080
		Total vcore-seconds taken by all reduce tasks=3382
		Total megabyte-seconds taken by all map tasks=22609920
		Total megabyte-seconds taken by all reduce tasks=3463168
	Map-Reduce Framework
		Map input records=18
		Map output records=18
		Map output bytes=234
		Map output materialized bytes=288
		Input split bytes=825
		Combine input records=0
		Combine output records=0
		Reduce input groups=6
		Reduce shuffle bytes=288
		Reduce input records=18
		Reduce output records=6
		Spilled Records=36
		Shuffled Maps =3
		Failed Shuffles=0
		Merged Map outputs=3
		GC time elapsed (ms)=166
		CPU time spent (ms)=2100
		Physical memory (bytes) snapshot=987889664
		Virtual memory (bytes) snapshot=2790764544
		Total committed heap usage (bytes)=721813504
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=79
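The WARN at 09:48:10 ("Hadoop command-line option parsing not performed") indicates the job's driver was launched without ToolRunner. A minimal sketch of the fix it suggests, assuming a hypothetical WordCount-style driver class (the actual job class is not visible in this log, and the mapper/reducer setup is omitted):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver. Implementing Tool and launching via ToolRunner lets
// Hadoop's GenericOptionsParser handle generic options (-D, -files, -libjars)
// before run() is called, which is what the JobSubmitter warning asks for.
public class WordCountDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() returns the Configuration already populated by
        // ToolRunner, including any -D key=value overrides from the CLI.
        Job job = Job.getInstance(getConf(), "word count");
        job.setJarByClass(WordCountDriver.class);
        // Mapper/Reducer/output-type configuration omitted -- not shown in the log.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options from args before invoking run(),
        // so run() only sees the application-specific arguments.
        System.exit(ToolRunner.run(new Configuration(), new WordCountDriver(), args));
    }
}
```

Launched as `hadoop jar app.jar WordCountDriver <in> <out>`, this form would accept generic options without the warning; the jar and class names here are illustrative, not taken from the log.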