➜ redspark git:(master) ✗ spark-submit --class "org.jruby.Main" --master local\[4\] --verbose build/libs/redspark-jruby-1.0-SNAPSHOT.jar simple.rb
Using properties file: null
19/05/18 18:13:22 WARN Utils: Your hostname, grape resolves to a loopback address: 127.0.0.1; using 192.168.1.102 instead (on interface wlp58s0)
19/05/18 18:13:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Parsed arguments:
  master                  local[4]
  deployMode              null
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          null
  driverMemory            null
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               org.jruby.Main
  primaryResource         file:/home/tyler/source/github/jruby-gradle/redspark/build/libs/redspark-jruby-1.0-SNAPSHOT.jar
  name                    org.jruby.Main
  childArgs               [simple.rb]
  jars                    null
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true
Spark properties used, including those specified through
--conf and those from the properties file null:
19/05/18 18:13:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Main class:
org.jruby.Main
Arguments:
simple.rb
Spark config:
(spark.master,local[4])
(spark.jars,file:/home/tyler/source/github/jruby-gradle/redspark/build/libs/redspark-jruby-1.0-SNAPSHOT.jar)
(spark.submit.deployMode,client)
(spark.app.name,org.jruby.Main)
Classpath elements:
file:/home/tyler/source/github/jruby-gradle/redspark/build/libs/redspark-jruby-1.0-SNAPSHOT.jar
>> LOADED RUBY SCRIPT
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/18 18:13:26 INFO SparkContext: Running Spark version 2.4.3
19/05/18 18:13:26 INFO SparkContext: Submitted application: Simple Application
19/05/18 18:13:26 INFO SecurityManager: Changing view acls to: tyler
19/05/18 18:13:26 INFO SecurityManager: Changing modify acls to: tyler
19/05/18 18:13:26 INFO SecurityManager: Changing view acls groups to:
19/05/18 18:13:26 INFO SecurityManager: Changing modify acls groups to:
19/05/18 18:13:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(tyler); groups with view permissions: Set(); users with modify permissions: Set(tyler); groups with modify permissions: Set()
19/05/18 18:13:26 INFO Utils: Successfully started service 'sparkDriver' on port 35313.
19/05/18 18:13:26 INFO SparkEnv: Registering MapOutputTracker
19/05/18 18:13:26 INFO SparkEnv: Registering BlockManagerMaster
19/05/18 18:13:26 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/18 18:13:26 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/18 18:13:26 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-a47f4ae5-3558-464c-9fb1-f4eb434f9db0
19/05/18 18:13:26 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
19/05/18 18:13:26 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/18 18:13:26 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/18 18:13:26 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.102:4040
19/05/18 18:13:26 INFO SparkContext: Added JAR file:/home/tyler/source/github/jruby-gradle/redspark/build/libs/redspark-jruby-1.0-SNAPSHOT.jar at spark://192.168.1.102:35313/jars/redspark-jruby-1.0-SNAPSHOT.jar with timestamp 1558228406837
19/05/18 18:13:26 INFO Executor: Starting executor ID driver on host localhost
19/05/18 18:13:27 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44509.
19/05/18 18:13:27 INFO NettyBlockTransferService: Server created on 192.168.1.102:44509
19/05/18 18:13:27 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/18 18:13:27 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.102, 44509, None)
19/05/18 18:13:27 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.102:44509 with 366.3 MB RAM, BlockManagerId(driver, 192.168.1.102, 44509, None)
19/05/18 18:13:27 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.102, 44509, None)
19/05/18 18:13:27 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.102, 44509, None)
19/05/18 18:13:27 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/home/tyler/source/github/jruby-gradle/redspark/spark-warehouse').
19/05/18 18:13:27 INFO SharedState: Warehouse path is 'file:/home/tyler/source/github/jruby-gradle/redspark/spark-warehouse'.
19/05/18 18:13:27 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/05/18 18:13:29 INFO FileSourceStrategy: Pruning directories with:
19/05/18 18:13:29 INFO FileSourceStrategy: Post-Scan Filters:
19/05/18 18:13:29 INFO FileSourceStrategy: Output Data Schema: struct<value: string>
19/05/18 18:13:29 INFO FileSourceScanExec: Pushed Filters:
Hello world from Ruby
19/05/18 18:13:29 INFO SparkUI: Stopped Spark web UI at http://192.168.1.102:4040
19/05/18 18:13:29 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/18 18:13:29 INFO MemoryStore: MemoryStore cleared
19/05/18 18:13:29 INFO BlockManager: BlockManager stopped
19/05/18 18:13:29 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/18 18:13:29 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/18 18:13:29 INFO SparkContext: Successfully stopped SparkContext
19/05/18 18:13:29 INFO ShutdownHookManager: Shutdown hook called
19/05/18 18:13:29 INFO ShutdownHookManager: Deleting directory /tmp/spark-7f8c683d-b5ed-49b4-ac11-6901f79386b5
19/05/18 18:13:29 INFO ShutdownHookManager: Deleting directory /tmp/spark-998e70ba-22fe-4ec3-9255-b398dc5eb037
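For reference, simple.rb itself is not included in this gist. Based on the output above (the ">> LOADED RUBY SCRIPT" marker, the "Simple Application" app name, the text-source scan with schema struct<value: string>, and the "Hello world from Ruby" line), it was presumably a JRuby script along the following lines. This is a hypothetical reconstruction: the README.md path and the count step are assumptions, while the SparkSession calls are the standard Spark 2.x Java API reached through JRuby's snake_case method aliases.

# simple.rb -- hypothetical sketch, not the actual script from this run
require 'java'

java_import 'org.apache.spark.sql.SparkSession'

# Printed as soon as org.jruby.Main loads the script, before Spark starts.
puts '>> LOADED RUBY SCRIPT'

# Matches "Submitted application: Simple Application" in the log.
spark = SparkSession.builder.app_name('Simple Application').get_or_create

# Reading a plain-text file produces the struct<value: string> scan seen in the
# FileSourceStrategy log lines. The path and the count action are assumptions.
lines = spark.read.text_file('README.md')
lines.count

puts 'Hello world from Ruby'

spark.stop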