@ryan-williams
Created September 27, 2015 17:43
"Provider org.apache.hadoop.fs.s3.S3FileSystem not found" error at different Spark versions - caused by stray META-INF in the directory spark-shell was launched from.
$ spark-select 1.4.0
$ $SPARK_HOME/bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/09/26 05:51:04 INFO SecurityManager: Changing view acls to: willir31
15/09/26 05:51:04 INFO SecurityManager: Changing modify acls to: willir31
15/09/26 05:51:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(willir31); users with modify permissions: Set(willir31)
15/09/26 05:51:04 INFO HttpServer: Starting HTTP Server
15/09/26 05:51:04 INFO Utils: Successfully started service 'HTTP class server' on port 48152.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/26 05:51:10 INFO SparkContext: Running Spark version 1.4.0
15/09/26 05:51:10 INFO SecurityManager: Changing view acls to: willir31
15/09/26 05:51:10 INFO SecurityManager: Changing modify acls to: willir31
15/09/26 05:51:10 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(willir31); users with modify permissions: Set(willir31)
15/09/26 05:51:11 INFO Slf4jLogger: Slf4jLogger started
15/09/26 05:51:11 INFO Remoting: Starting remoting
15/09/26 05:51:11 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@172.29.46.14:35873]
15/09/26 05:51:11 INFO Utils: Successfully started service 'sparkDriver' on port 35873.
15/09/26 05:51:11 INFO SparkEnv: Registering MapOutputTracker
15/09/26 05:51:11 INFO SparkEnv: Registering BlockManagerMaster
15/09/26 05:51:11 INFO DiskBlockManager: Created local directory at /tmp/spark-36c52d6f-631c-4892-af6f-574847448d47/blockmgr-00d7cfa3-9153-458e-a60b-cae8b5ed32b8
15/09/26 05:51:11 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/09/26 05:51:11 INFO HttpFileServer: HTTP File server directory is /tmp/spark-36c52d6f-631c-4892-af6f-574847448d47/httpd-efb8453c-3a33-4e4b-af48-d3abc77c2a73
15/09/26 05:51:11 INFO HttpServer: Starting HTTP Server
15/09/26 05:51:11 INFO Utils: Successfully started service 'HTTP file server' on port 45201.
15/09/26 05:51:12 INFO SparkEnv: Registering OutputCommitCoordinator
15/09/26 05:51:12 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/09/26 05:51:12 INFO SparkUI: Started SparkUI at http://172.29.46.14:4040
15/09/26 05:51:12 INFO Executor: Starting executor ID driver on host localhost
15/09/26 05:51:12 INFO Executor: Using REPL class URI: http://172.29.46.14:48152
15/09/26 05:51:12 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38266.
15/09/26 05:51:12 INFO NettyBlockTransferService: Server created on 38266
15/09/26 05:51:12 INFO BlockManagerMaster: Trying to register BlockManager
15/09/26 05:51:12 INFO BlockManagerMasterEndpoint: Registering block manager localhost:38266 with 265.4 MB RAM, BlockManagerId(driver, localhost, 38266)
15/09/26 05:51:12 INFO BlockManagerMaster: Registered BlockManager
15/09/26 05:51:13 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/09/26 05:51:14 INFO HiveContext: Initializing execution hive, version 0.13.1
15/09/26 05:51:15 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/09/26 05:51:15 INFO ObjectStore: ObjectStore, initialize called
15/09/26 05:51:15 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/09/26 05:51:15 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/09/26 05:51:15 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/26 05:51:16 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/26 05:51:21 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/09/26 05:51:21 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
15/09/26 05:51:22 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:51:22 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:51:26 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:51:26 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:51:27 INFO ObjectStore: Initialized ObjectStore
15/09/26 05:51:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
15/09/26 05:51:29 INFO HiveMetaStore: Added admin role in metastore
15/09/26 05:51:29 INFO HiveMetaStore: Added public role in metastore
15/09/26 05:51:29 INFO HiveMetaStore: No user is added in admin role, since config is empty
15/09/26 05:51:29 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
15/09/26 05:51:29 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.
scala>
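Spark 1.4.0 comes up without incident. The same commands, from the same directory, on 1.4.1: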
$ spark-select 1.4.1
$ $SPARK_HOME/bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/09/26 05:51:44 INFO SecurityManager: Changing view acls to: willir31
15/09/26 05:51:44 INFO SecurityManager: Changing modify acls to: willir31
15/09/26 05:51:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(willir31); users with modify permissions: Set(willir31)
15/09/26 05:51:44 INFO HttpServer: Starting HTTP Server
15/09/26 05:51:44 INFO Utils: Successfully started service 'HTTP class server' on port 47400.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.1
      /_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/26 05:51:47 INFO SparkContext: Running Spark version 1.4.1
15/09/26 05:51:47 INFO SecurityManager: Changing view acls to: willir31
15/09/26 05:51:47 INFO SecurityManager: Changing modify acls to: willir31
15/09/26 05:51:47 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(willir31); users with modify permissions: Set(willir31)
15/09/26 05:51:48 INFO Slf4jLogger: Slf4jLogger started
15/09/26 05:51:48 INFO Remoting: Starting remoting
15/09/26 05:51:48 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@172.29.46.14:59308]
15/09/26 05:51:48 INFO Utils: Successfully started service 'sparkDriver' on port 59308.
15/09/26 05:51:48 INFO SparkEnv: Registering MapOutputTracker
15/09/26 05:51:48 INFO SparkEnv: Registering BlockManagerMaster
15/09/26 05:51:48 INFO DiskBlockManager: Created local directory at /tmp/spark-8ef69b35-1a6a-4ec9-9249-34a7a145278c/blockmgr-e64307ef-a982-4925-8cbc-ba7a8152f4f4
15/09/26 05:51:48 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/09/26 05:51:48 INFO HttpFileServer: HTTP File server directory is /tmp/spark-8ef69b35-1a6a-4ec9-9249-34a7a145278c/httpd-7ee5bf68-8941-4506-b2dc-34b867443c04
15/09/26 05:51:48 INFO HttpServer: Starting HTTP Server
15/09/26 05:51:48 INFO Utils: Successfully started service 'HTTP file server' on port 37784.
15/09/26 05:51:48 INFO SparkEnv: Registering OutputCommitCoordinator
15/09/26 05:51:48 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/09/26 05:51:48 INFO SparkUI: Started SparkUI at http://172.29.46.14:4040
15/09/26 05:51:48 INFO Executor: Starting executor ID driver on host localhost
15/09/26 05:51:48 INFO Executor: Using REPL class URI: http://172.29.46.14:47400
15/09/26 05:51:48 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56090.
15/09/26 05:51:48 INFO NettyBlockTransferService: Server created on 56090
15/09/26 05:51:48 INFO BlockManagerMaster: Trying to register BlockManager
15/09/26 05:51:48 INFO BlockManagerMasterEndpoint: Registering block manager localhost:56090 with 265.4 MB RAM, BlockManagerId(driver, localhost, 56090)
15/09/26 05:51:48 INFO BlockManagerMaster: Registered BlockManager
15/09/26 05:51:49 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/09/26 05:51:49 INFO HiveContext: Initializing execution hive, version 0.13.1
15/09/26 05:51:50 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/09/26 05:51:50 INFO ObjectStore: ObjectStore, initialize called
15/09/26 05:51:50 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/hpc/users/willir31/sparks/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/demeter/users/willir31/sparks/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar."
15/09/26 05:51:50 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/hpc/users/willir31/sparks/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/demeter/users/willir31/sparks/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar."
15/09/26 05:51:50 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/hpc/users/willir31/sparks/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/demeter/users/willir31/sparks/spark-1.4.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar."
15/09/26 05:51:50 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/09/26 05:51:50 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/09/26 05:51:50 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/26 05:51:50 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/26 05:51:56 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/09/26 05:51:56 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
15/09/26 05:51:57 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:51:57 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:52:01 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:52:01 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/09/26 05:52:03 INFO ObjectStore: Initialized ObjectStore
15/09/26 05:52:03 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:116)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 56 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 61 more
Caused by: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.s3.S3FileSystem not found
at java.util.ServiceLoader.fail(ServiceLoader.java:231)
at java.util.ServiceLoader.access$300(ServiceLoader.java:181)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:365)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2563)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:354)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:112)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:144)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:504)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
... 66 more
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^
scala>
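Spark 1.5.0 fails the same way (with the quieter repl log4j profile):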
$ spark-select 1.5.0
$ $SPARK_HOME/bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0
      /_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/26 05:55:49 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Spark context available as sc.
15/09/26 05:55:50 WARN HiveConf: HiveConf of name hive.enable.spark.execution.engine does not exist
java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.s3.S3FileSystem not found
at java.util.ServiceLoader.fail(ServiceLoader.java:231)
at java.util.ServiceLoader.access$300(ServiceLoader.java:181)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:365)
at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2563)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:505)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^
scala>
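Given the cause above, the fix is to remove the stray directory or launch from a clean one. A sketch (commands only, output not shown):

$ cat META-INF/services/org.apache.hadoop.fs.FileSystem    # confirm the stray service file exists
$ rm -r META-INF                                           # or just cd to a clean directory
$ $SPARK_HOME/bin/spark-shell                              # starts normally again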