daschl/log4j
Created December 3, 2021 07:17
Spark Driver Logs
mon: Started comm channel server
21/12/03 07:00:35 INFO DriverDaemon: Driver daemon started.
21/12/03 07:00:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:00:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:00:37 INFO DriverCorral: Loading the root classloader
21/12/03 07:00:37 INFO DriverCorral: Starting sql repl ReplId-6f91f-a5678-c159b-7
21/12/03 07:00:37 INFO DriverCorral: Starting sql repl ReplId-7267c-668a8-9ada0-0
21/12/03 07:00:37 INFO DriverCorral: Starting sql repl ReplId-1c64a-3f8fc-5c540-2
21/12/03 07:00:37 INFO DriverCorral: Starting sql repl ReplId-3b491-f76ff-3cbdd-d
21/12/03 07:00:37 INFO DriverCorral: Starting sql repl ReplId-594cf-2f68f-28a85-6
21/12/03 07:00:37 INFO SQLDriverWrapper: setupRepl:ReplId-7267c-668a8-9ada0-0: finished to load
21/12/03 07:00:37 INFO SQLDriverWrapper: setupRepl:ReplId-3b491-f76ff-3cbdd-d: finished to load
21/12/03 07:00:37 INFO SQLDriverWrapper: setupRepl:ReplId-1c64a-3f8fc-5c540-2: finished to load
21/12/03 07:00:37 INFO SQLDriverWrapper: setupRepl:ReplId-594cf-2f68f-28a85-6: finished to load
21/12/03 07:00:37 INFO SQLDriverWrapper: setupRepl:ReplId-6f91f-a5678-c159b-7: finished to load
21/12/03 07:00:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:00:37 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:00:37 INFO DriverCorral: Starting r repl ReplId-1e428-1be4b-ff00d-f
21/12/03 07:00:37 INFO ROutputStreamHandler: Connection succeeded on port 36073
21/12/03 07:00:37 INFO ROutputStreamHandler: Connection succeeded on port 36051
21/12/03 07:00:37 INFO RDriverLocal: 1. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: object created with for ReplId-1e428-1be4b-ff00d-f.
21/12/03 07:00:37 INFO RDriverLocal: 2. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: initializing ...
21/12/03 07:00:38 INFO RDriverLocal: 3. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: started RBackend thread on port 42431
21/12/03 07:00:38 INFO RDriverLocal: 4. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: waiting for SparkR to be installed ...
21/12/03 07:00:59 INFO RDriverLocal$: SparkR installation completed.
21/12/03 07:00:59 INFO RDriverLocal: 5. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: launching R process ...
21/12/03 07:00:59 INFO RDriverLocal: 6. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: cgroup isolation disabled, not placing R process in REPL cgroup.
21/12/03 07:00:59 INFO RDriverLocal: 7. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: starting R process on port 1100 (attempt 1) ...
21/12/03 07:00:59 INFO RDriverLocal: 8. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: setting up BufferedStreamThread with bufferSize: 1000.
21/12/03 07:01:00 INFO RDriverLocal: 9. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: R process started with RServe listening on port 1100.
21/12/03 07:01:00 INFO RDriverLocal: 10. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: starting interpreter to talk to R process ...
21/12/03 07:01:01 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
21/12/03 07:01:01 WARN SparkSession$Builder: Using an existing SparkSession; the static sql configurations will not take effect.
21/12/03 07:01:01 INFO ROutputStreamHandler: Successfully connected to stdout in the RShell.
21/12/03 07:01:01 INFO ROutputStreamHandler: Successfully connected to stderr in the RShell.
21/12/03 07:01:01 INFO RDriverLocal: 11. RDriverLocal.efa18815-13e1-479d-a557-6e3463523e30: R interpreter is connected.
21/12/03 07:01:01 INFO RDriverWrapper: setupRepl:ReplId-1e428-1be4b-ff00d-f: finished to load
21/12/03 07:05:35 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:05:35 INFO HiveUtils: Initializing HiveMetastoreConnection version 0.13.0 using file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-exec--org.apache.hive__hive-exec__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive.shims--hive-shims-0.20--org.apache.hive.shims__hive-shims-0.20__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive.shims--hive-shims-common-secure--org.apache.hive.shims__hive-shims-common-secure__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--asm--asm-commons--asm__asm-commons__3.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-io--commons-io--commons-io__commons-io__2.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive.shims--hive-shims-0.20S--org.apache.hive.shims__hive-shims-0.20S__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.thrift--libfb303--org.apache.thrift__libfb303__0.9.0.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--antlr--antlr--antlr__antlr__2.7.7.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.objenesis--objenesis--org.objenesis__objenesis__1.2.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.eclipse.jetty.aggregate--jetty-all--org.eclipse.jetty.aggregate__jetty-all__7.6.0.v20120127.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.zookeeper--zookeeper--org.apache.zookeeper__zookeeper__3.4.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--stax--stax-api--stax__stax-api__1.0.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.mortbay.jetty--jetty--org.mortbay.jetty__jetty__6.1.26.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-lang--commons-lang--commons-lang__commons-lang__2.4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--oro--oro--oro__oro__2.0.8.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.ow2.asm--asm--org.ow2.asm__asm__4.0.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-service--org.apache.hive__hive-service__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-serde--org.apache.hive__hive-serde__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.antlr--stringtemplate--org.antlr__stringtemplate__3.2.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.codehaus.jackson--jackson-mapper-asl--org.codehaus.jackson__jackson-mapper-asl__1.9.13.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.ant--ant--org.apache.ant__ant__1.9.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.esotericsoftware.reflectasm--reflectasm-shaded--com.esotericsoftware.reflectasm__reflectasm-shaded__1.07.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.derby--derby--org.apache.derby__derby__10.10.1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--junit--junit--junit__junit__3.8.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-httpclient--commons-httpclient--commons-httpclient__commons-httpclient__3.0.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.geronimo.specs--geronimo-annotation_1.0_spec--org.apache.geronimo.specs__geronimo-annotation_1.0_spec__1.1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-beeline--org.apache.hive__hive-beeline__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.ant--ant-launcher--org.apache.ant__ant-launcher__1.9.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--log4j--log4j--log4j__log4j__1.2.16.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-codec--commons-codec--commons-codec__commons-codec__1.8.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-metastore--org.apache.hive__hive-metastore__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-collections--commons-collections--commons-collections__commons-collections__3.2.2.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.jolbox--bonecp--com.jolbox__bonecp__0.8.0.RELEASE.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.httpcomponents--httpclient--org.apache.httpcomponents__httpclient__4.4.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.slf4j--slf4j-api--org.slf4j__slf4j-api__1.7.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.xerial.snappy--snappy-java--org.xerial.snappy__snappy-java__1.0.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.commons--commons-compress--org.apache.commons__commons-compress__1.9.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.commons--commons-lang3--org.apache.commons__commons-lang3__3.4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.thrift--libthrift--org.apache.thrift__libthrift__0.9.2.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--jline--jline--jline__jline__0.9.94.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-ant--org.apache.hive__hive-ant__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--asm--asm-tree--asm__asm-tree__3.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--net.sf.jpam--jpam--net.sf.jpam__jpam__1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.datanucleus--datanucleus-api-jdo--org.datanucleus__datanucleus-api-jdo__4.2.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.httpcomponents--httpcore--org.apache.httpcomponents__httpcore__4.2.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--javax.transaction--transaction-api--javax.transaction__transaction-api__1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.datanucleus--datanucleus-rdbms--org.datanucleus__datanucleus-rdbms__4.1.7.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--asm--asm--asm__asm__3.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.mortbay.jetty--servlet-api--org.mortbay.jetty__servlet-api__2.5-20081211.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.geronimo.specs--geronimo-jaspic_1.0_spec--org.apache.geronimo.specs__geronimo-jaspic_1.0_spec__1.0.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.geronimo.specs--geronimo-jta_1.1_spec--org.apache.geronimo.specs__geronimo-jta_1.1_spec__1.1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--javax.activation--activation--javax.activation__activation__1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-jdbc--org.apache.hive__hive-jdbc__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.codehaus.jackson--jackson-core-asl--org.codehaus.jackson__jackson-core-asl__1.9.13.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive.shims--hive-shims-0.23--org.apache.hive.shims__hive-shims-0.23__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.esotericsoftware.kryo--kryo--com.esotericsoftware.kryo__kryo__2.21.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.slf4j--slf4j-log4j12--org.slf4j__slf4j-log4j12__1.7.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.mortbay.jetty--jetty-util--org.mortbay.jetty__jetty-util__6.1.26.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-logging--commons-logging--commons-logging__commons-logging__1.1.3.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--javax.mail--mail--javax.mail__mail__1.4.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.codehaus.groovy--groovy-all--org.codehaus.groovy__groovy-all__2.1.6.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--javax.jdo--jdo-api--javax.jdo__jdo-api__3.0.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.datanucleus--javax.jdo--org.datanucleus__javax.jdo__3.2.0-m3.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive.shims--hive-shims-common--org.apache.hive.shims__hive-shims-common__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.antlr--antlr-runtime--org.antlr__antlr-runtime__3.4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--javax.servlet--servlet-api--javax.servlet__servlet-api__2.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.datanucleus--datanucleus-core--org.datanucleus__datanucleus-core__4.1.6.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.thoughtworks.paranamer--paranamer--com.thoughtworks.paranamer__paranamer__2.8.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.velocity--velocity--org.apache.velocity__velocity__1.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.google.guava--guava--com.google.guava__guava__11.0.2.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-common--org.apache.hive__hive-common__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.google.code.findbugs--jsr305--com.google.code.findbugs__jsr305__1.3.9.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--commons-cli--commons-cli--commons-cli__commons-cli__1.2.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--com.esotericsoftware.minlog--minlog--com.esotericsoftware.minlog__minlog__1.2.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-cli--org.apache.hive__hive-cli__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.hive--hive-shims--org.apache.hive__hive-shims__0.13.1-databricks-4.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.apache.avro--avro--org.apache.avro__avro__1.7.5.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--javax.transaction--jta--javax.transaction__jta__1.1.jar:file:/databricks/databricks-hive/----workspace_spark_3_2--maven-trees--hive-metastore-databricks--org.antlr--ST4--org.antlr__ST4__4.0.4.jar:file:/databricks/databricks-hive/bonecp-configs.jar
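
The HiveUtils line above lists the metastore client classpath as one colon-separated string of Databricks-repackaged jars whose basenames follow a group__artifact__version.jar pattern; note that it ships log4j__log4j__1.2.16.jar, i.e. log4j 1.x. A minimal Python sketch for recovering the Maven coordinates from such a line (assuming the classpath portion of the line is held in a string named classpath; the helper name is illustrative, not part of any Databricks API):

# Sketch: extract (group, artifact, version) triples from the colon-separated
# 'file:' classpath in the HiveUtils log line above.
def jar_coordinates(classpath: str):
    for entry in classpath.split(":file:"):
        jar = entry.rsplit("--", 1)[-1]  # e.g. 'log4j__log4j__1.2.16.jar'
        if jar.endswith(".jar") and "__" in jar:
            group, artifact, version = jar[: -len(".jar")].split("__", 2)
            yield group, artifact, version

# e.g. next(c for c in jar_coordinates(classpath) if c[1] == "log4j")
# yields ('log4j', 'log4j', '1.2.16')
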
21/12/03 07:05:35 INFO DriverCorral: DBFS health check ok
21/12/03 07:05:35 INFO HiveClientImpl: Warehouse location for Hive client (version 0.13.1) is dbfs:/user/hive/warehouse
21/12/03 07:05:36 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
21/12/03 07:05:36 INFO ObjectStore: ObjectStore, initialize called
21/12/03 07:05:36 INFO Persistence: Property datanucleus.fixedDatastore unknown - will be ignored
21/12/03 07:05:36 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
21/12/03 07:05:36 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
21/12/03 07:05:39 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
21/12/03 07:05:39 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
21/12/03 07:05:42 INFO ObjectStore: Initialized ObjectStore
21/12/03 07:05:42 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.0
21/12/03 07:05:43 INFO HiveMetaStore: Added admin role in metastore
21/12/03 07:05:43 INFO HiveMetaStore: Added public role in metastore
21/12/03 07:05:43 INFO HiveMetaStore: No user is added in admin role, since config is empty
21/12/03 07:05:43 INFO HiveMetaStore: 0: get_database: default
21/12/03 07:05:43 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/12/03 07:05:43 INFO HiveMetaStore: 0: get_database: default
21/12/03 07:05:43 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/12/03 07:05:43 INFO DriverCorral: Metastore health check ok
21/12/03 07:06:00 INFO DriverCorral: AttachLibraries - candidate libraries: List(JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE))
21/12/03 07:06:00 INFO DriverCorral: AttachLibraries - new libraries to install (including resolved dependencies): List(JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE))
21/12/03 07:06:00 INFO SharedDriverContext: attachLibrariesToSpark JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE)
21/12/03 07:06:00 INFO LibraryDownloadManager: Downloading a library that was not in the cache: JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE)
21/12/03 07:06:00 INFO LibraryDownloadManager: Attempt 1: wait until library JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE) is downloaded
21/12/03 07:06:00 INFO DbfsBlockInputStream: Created remote input stream for block 0
21/12/03 07:06:01 INFO LibraryDownloadManager: Downloaded library JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE) as local file /local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar in 1463 milliseconds
21/12/03 07:06:01 INFO SharedDriverContext: Successfully saved library JavaJarId(dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar,,NONE) to local file /local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar
21/12/03 07:06:01 INFO SparkContext: Added file /local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar at file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar with timestamp 1638515161870
21/12/03 07:06:01 INFO Utils: Copying /local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar to /local_disk0/spark-2d2bac26-5d3c-4daf-8661-3eef5f46af82/userFiles-a66bdc58-cc3f-433e-b825-3bf0a0fe1651/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar
21/12/03 07:06:01 INFO SparkContext: Added JAR /local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar at spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar with timestamp 1638515161968
21/12/03 07:06:01 INFO SharedDriverContext: Successfully attached library dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar to Spark
21/12/03 07:06:02 INFO LibraryState: Successfully attached library dbfs:/FileStore/jars/acfaf781_0c8b_46b1_a8e5_dc60cc54663a-spark_connector_assembly_3_2_0_SNAPSHOT-91282.jar
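
The timestamp values SparkContext logs for the added file and JAR above (1638515161870 and 1638515161968) are Unix epoch milliseconds; converting them confirms they match the 07:06:01 UTC wall-clock time of the surrounding lines. A quick Python check:

# Sketch: SparkContext's "with timestamp ..." values are epoch milliseconds.
from datetime import datetime, timezone

for ms in (1638515161870, 1638515161968):
    print(datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat())
# 2021-12-03T07:06:01.870000+00:00
# 2021-12-03T07:06:01.968000+00:00
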
21/12/03 07:10:35 INFO DriverCorral: DBFS health check ok
21/12/03 07:10:35 INFO HiveMetaStore: 0: get_database: default
21/12/03 07:10:35 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/12/03 07:10:35 INFO DriverCorral: Metastore health check ok
21/12/03 07:12:56 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:12:56 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:12:56 INFO DriverCorral: Starting scala repl ReplId-25ce9-923dd-1e062-c
21/12/03 07:12:56 INFO ClusterLoadMonitor: Added query with execution ID:0. Current active queries:1
21/12/03 07:12:56 INFO LogicalPlanStats: Setting LogicalPlanStats visitor to com.databricks.sql.optimizer.statsEstimation.DatabricksLogicalPlanStatsVisitor$
21/12/03 07:12:57 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:12:57 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:12:57 INFO DriverCorral: Starting python repl ReplId-2e435-4d4b2-83855-c
21/12/03 07:12:57 INFO ClusterLoadMonitor: Added query with execution ID:1. Current active queries:2
21/12/03 07:12:57 INFO LogicalPlanStats: Setting LogicalPlanStats visitor to com.databricks.sql.optimizer.statsEstimation.DatabricksLogicalPlanStatsVisitor$
21/12/03 07:12:57 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:12:57 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
21/12/03 07:12:57 INFO DriverCorral: Starting sql repl ReplId-7b2ec-ec402-c67cc-6
21/12/03 07:12:58 INFO ClusterLoadMonitor: Added query with execution ID:2. Current active queries:3
21/12/03 07:12:58 INFO LogicalPlanStats: Setting LogicalPlanStats visitor to com.databricks.sql.optimizer.statsEstimation.DatabricksLogicalPlanStatsVisitor$
21/12/03 07:12:58 INFO ClusterLoadAvgHelper: Current cluster load: 3, Old Ema: 0.0, New Ema: 3.0
21/12/03 07:12:58 INFO SparkContext: The JAR file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar at spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar has been added already. Overwriting of added jar is not supported in the current version.
21/12/03 07:12:58 INFO SparkContext: The JAR file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar at spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar has been added already. Overwriting of added jar is not supported in the current version.
21/12/03 07:12:58 INFO SparkContext: The JAR file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar at spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar has been added already. Overwriting of added jar is not supported in the current version.
21/12/03 07:12:59 INFO ClusterLoadMonitor: Removed query with execution ID:1. Current active queries:2
21/12/03 07:12:59 INFO ClusterLoadMonitor: Removed query with execution ID:2. Current active queries:1
21/12/03 07:12:59 INFO ClusterLoadMonitor: Removed query with execution ID:0. Current active queries:0
21/12/03 07:12:59 INFO SQLDriverWrapper: setupRepl:ReplId-7b2ec-ec402-c67cc-6: finished to load
21/12/03 07:12:59 INFO ScalaDriverWrapper: setupRepl:ReplId-25ce9-923dd-1e062-c: finished to load
21/12/03 07:12:59 INFO JupyterDriverLocal: Starting gateway server for repl ReplId-2e435-4d4b2-83855-c
21/12/03 07:12:59 INFO PythonPy4JUtil: Using default multithreaded mode in Py4J
21/12/03 07:13:00 INFO ProgressReporter$: Added result fetcher for 8876259256849300678_5273856758392610397_2bba7ea6-99b4-43ca-8f5e-e92e4a27cc41
21/12/03 07:13:00 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/pythonVirtualEnvDirs/virtualEnv-03fa8d68-7128-437a-b6de-6ebbeb2b5d8a, -p, /databricks/python/bin/python, --no-download, --no-setuptools, --no-wheel)
21/12/03 07:13:01 INFO ClusterLoadMonitor: Added query with execution ID:3. Current active queries:1
21/12/03 07:13:01 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 3.0, New Ema: 2.6999999999999997
21/12/03 07:13:01 INFO HiveMetaStore: 1: get_databases: *
21/12/03 07:13:01 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_databases: *
21/12/03 07:13:01 INFO HiveMetaStore: 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
21/12/03 07:13:01 INFO ObjectStore: ObjectStore, initialize called
21/12/03 07:13:02 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
21/12/03 07:13:02 INFO ObjectStore: Initialized ObjectStore
21/12/03 07:13:02 INFO AsyncEventQueue: Process of event SparkListenerSQLUsageLogging(0,1638515578972,CallSite(sql at DriverLocal.scala:222,org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$3(DriverLocal.scala:222)
com.databricks.sql.acl.CheckPermissions$.trusted(CheckPermissions.scala:1453)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$new$2(DriverLocal.scala:222)
scala.collection.Iterator.foreach(Iterator.scala:943)
scala.collection.Iterator.foreach$(Iterator.scala:943)
scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
scala.collection.IterableLike.foreach(IterableLike.scala:74)
scala.collection.IterableLike.foreach$(IterableLike.scala:73)
scala.collection.AbstractIterable.foreach(Iterable.scala:56)
com.databricks.backend.daemon.driver.DriverLocal.<init>(DriverLocal.scala:206)
com.databricks.backend.daemon.driver.ScalaDriverLocal.<init>(ScalaDriverLocal.scala:51)
com.databricks.backend.daemon.driver.ScalaDriverWrapper.instantiateDriver(DriverWrapper.scala:744)
com.databricks.backend.daemon.driver.DriverWrapper.setupRepl(DriverWrapper.scala:335)
com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:224)
java.lang.Thread.run(Thread.java:748)),org.apache.spark.sql.internal.SQLConf@712733ba,Some(CommandContext(Map(),Map())),== Parsed Logical Plan ==
AddJarsCommand [/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
== Analyzed Logical Plan ==
AddJarsCommand [/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
== Optimized Logical Plan ==
AddJarsCommand [/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
== Physical Plan ==
Execute AddJarsCommand
+- AddJarsCommand [/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
,None,None,None,None) by listener SQLAppStatusListener took 3.461842834s.
21/12/03 07:13:03 INFO ProgressReporter$: Added result fetcher for 2724283203918366252_5083101478493670975_089e24cf239e4bed8856ba9d5de733bd
21/12/03 07:13:03 INFO DatabricksUtils: created python virtualenv: /local_disk0/pythonVirtualEnvDirs/virtualEnv-03fa8d68-7128-437a-b6de-6ebbeb2b5d8a
21/12/03 07:13:03 INFO Utils: resolved command to be run: List(/databricks/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs)))
21/12/03 07:13:03 INFO Utils: resolved command to be run: List(/local_disk0/pythonVirtualEnvDirs/virtualEnv-03fa8d68-7128-437a-b6de-6ebbeb2b5d8a/bin/python, -c, from distutils.sysconfig import get_python_lib; print(get_python_lib()))
21/12/03 07:13:03 INFO DatabricksUtils: created sites.pth at /local_disk0/pythonVirtualEnvDirs/virtualEnv-03fa8d68-7128-437a-b6de-6ebbeb2b5d8a/lib/python3.8/site-packages/sites.pth
21/12/03 07:13:03 INFO DBUtilsPythonEnvManager: Time spent to start virtualenv /local_disk0/pythonVirtualEnvDirs/virtualEnv-03fa8d68-7128-437a-b6de-6ebbeb2b5d8a is 3067
21/12/03 07:13:03 INFO IpykernelUtils$: Python process builder: [/databricks/spark/python/pyspark/wrapped_python.py, root, /local_disk0/pythonVirtualEnvDirs/virtualEnv-03fa8d68-7128-437a-b6de-6ebbeb2b5d8a/bin/python, /databricks/python_shell/scripts/db_ipykernel_launcher.py, -f, /tmp/ipykernel-connection-ReplId-2e435-4d4b2-83855-c.json]
21/12/03 07:13:03 INFO SignalUtils: Registering signal handler for INT
21/12/03 07:13:04 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 2.6999999999999997, New Ema: 2.445
21/12/03 07:13:04 INFO CodeGenerator: Code generated in 2069.015146 ms
21/12/03 07:13:05 INFO ClusterLoadMonitor: Removed query with execution ID:3. Current active queries:0
21/12/03 07:13:05 INFO ClusterLoadMonitor: Added query with execution ID:4. Current active queries:1
21/12/03 07:13:05 INFO CodeGenerator: Code generated in 46.849015 ms
21/12/03 07:13:06 INFO ClusterLoadMonitor: Removed query with execution ID:4. Current active queries:0
21/12/03 07:13:06 INFO CodeGenerator: Code generated in 99.969063 ms
21/12/03 07:13:06 INFO ProgressReporter$: Removed result fetcher for 8876259256849300678_5273856758392610397_2bba7ea6-99b4-43ca-8f5e-e92e4a27cc41
21/12/03 07:13:07 INFO AsyncEventQueue: Process of event SparkListenerSQLUsageLogging(3,1638515585018,CallSite(sql at SQLDriverLocal.scala:91,org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
com.databricks.backend.daemon.driver.SQLDriverLocal.$anonfun$executeSql$1(SQLDriverLocal.scala:91)
scala.collection.immutable.List.map(List.scala:293)
com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:37)
com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:144)
com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:564)
com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:215)
scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:213)
com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:210)
com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:50)
com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:251)
com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:243)
com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:50)
com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:541)
com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:693)
scala.util.Try$.apply(Try.scala:213)
com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:685)
com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:526)
com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:638)),org.apache.spark.sql.internal.SQLConf@3bbe45be,Some(CommandContext(Map(opId -> ServerBackend-9e8f8b1f2fc123a2, opTarget -> com.databricks.backend.common.rpc.InternalDriverBackendMessages$StartRepl, serverBackendName -> com.databricks.backend.daemon.driver.DriverCorral, notebookId -> 3706255681241255, projectName -> driver, eventWindowTime -> 4299928, httpTarget -> /websocket, buildHash -> e412bd9a983aa3092e4597bc59e5a87647c4c1ab, webSocketRpcMethod -> query, browserHash -> #notebook/3706255681241255/command/3706255681241256, host -> 10.172.227.195, notebookLanguage -> scala, hostName -> 1203-065518-2p3o92ah-10-172-227-195, httpMethod -> GET, browserIdleTime -> 1185, jettyRpcJettyVersion -> 9, browserTabId -> 67e9a168-7c78-4380-b7a9-bcebe4588332, sourceIpAddress -> 84.112.92.152, browserUserAgent -> Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:94.0) Gecko/20100101 Firefox/94.0, orgId -> 3567938889966569, userAgent -> Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:94.0) Gecko/20100101 Firefox/94.0, rootOpId -> ServiceMain-4e38355a1380003, sessionId -> ephemeral-e26c00a5-947a-46bb-a636-9dc35424aab9, clientBranchName -> 3.60.1638395955, browserHasFocus -> true, userId -> 8634867224347611, browserIsHidden -> false, opType -> ServerBackend, sourcePortNumber -> 49906, user -> michael.nitschinger@couchbase.com, browserHostName -> community.cloud.databricks.com, parentOpId -> RPCClient-4e38355a1380a86, jettyRpcType -> InternalDriverBackendMessages$DriverBackendRequest),Map(api_url -> https://community.cloud.databricks.com, api_token -> [REDACTED]))),== Parsed Logical Plan ==
ShowNamespaces [databaseName#4]
+- ResolvedNamespace com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog@6e5a4289
== Analyzed Logical Plan ==
databaseName: string
ShowNamespaces [databaseName#4]
+- ResolvedNamespace com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog@6e5a4289
== Optimized Logical Plan ==
ShowNamespaces [databaseName#4]
+- ResolvedNamespace com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog@6e5a4289
== Physical Plan ==
ShowNamespaces [databaseName#4], com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog@6e5a4289
,None,Some(databricks-sql-repl),Some(8876259256849300678_5273856758392610397_2bba7ea6-99b4-43ca-8f5e-e92e4a27cc41),None) by listener SQLAppStatusListener took 2.150349606s.
21/12/03 07:13:07 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 2.445, New Ema: 2.0782499999999997
21/12/03 07:13:07 INFO ProgressReporter$: Added result fetcher for 8876259256849300678_7196483158319821183_089ed89b-25a6-4193-9534-398c161e8b10
21/12/03 07:13:08 WARN SimpleFunctionRegistry: The function getargument replaced a previously registered function.
21/12/03 07:13:08 INFO ClusterLoadMonitor: Added query with execution ID:5. Current active queries:1
21/12/03 07:13:08 INFO HiveMetaStore: 1: get_database: global_temp
21/12/03 07:13:08 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: global_temp
21/12/03 07:13:08 ERROR RetryingHMSHandler: NoSuchObjectException(message:There is no database named global_temp)
at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:487)
at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:498)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
at com.sun.proxy.$Proxy41.getDatabase(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_database(HiveMetaStore.java:796)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
at com.sun.proxy.$Proxy43.get_database(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:949)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy44.getDatabase(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1165)
at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1154)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$databaseExists$1(HiveClientImpl.scala:436)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:345)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$retryLocked$1(HiveClientImpl.scala:246)
at org.apache.spark.sql.hive.client.HiveClientImpl.synchronizeOnObject(HiveClientImpl.scala:282)
at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:238)
at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:325)
at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:436)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1(PoolingHiveClient.scala:274)
at org.apache.spark.sql.hive.client.PoolingHiveClient.$anonfun$databaseExists$1$adapted(PoolingHiveClient.scala:273)
at org.apache.spark.sql.hive.client.PoolingHiveClient.withHiveClient(PoolingHiveClient.scala:112)
at org.apache.spark.sql.hive.client.PoolingHiveClient.databaseExists(PoolingHiveClient.scala:273)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:292)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:80)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$2(HiveExternalCatalog.scala:151)
at org.apache.spark.sql.hive.HiveExternalCatalog.maybeSynchronized(HiveExternalCatalog.scala:112)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$withClient$1(HiveExternalCatalog.scala:150)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:377)
at com.databricks.backend.daemon.driver.ProgressReporter$.withStatusCode(ProgressReporter.scala:363)
at com.databricks.spark.util.SparkDatabricksProgressReporter$.withStatusCode(ProgressReporter.scala:34)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:149)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:292)
at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.databaseExists(ExternalCatalogWithListener.scala:77)
at org.apache.spark.sql.internal.SharedState.$anonfun$globalTempViewManager$1(SharedState.scala:222)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at scala.util.Try$.apply(Try.scala:213)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:222)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:219)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$hiveCatalog$2(HiveSessionStateBuilder.scala:75)
at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.globalTempViewManager$lzycompute(SessionCatalog.scala:455)
at org.apache.spark.sql.catalyst.catalog.SessionCatalogImpl.globalTempViewManager(SessionCatalog.scala:455)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.globalTempViewManagerName$lzycompute(ManagedCatalogSessionCatalog.scala:76)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.globalTempViewManagerName(ManagedCatalogSessionCatalog.scala:75)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:891)
at com.databricks.sql.managedcatalog.ManagedCatalogSessionCatalog.doListTables(ManagedCatalogSessionCatalog.scala:930)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listTables(SessionCatalog.scala:272)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listTables(SessionCatalog.scala:263)
at org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog.listTables(V2SessionCatalog.scala:57)
at org.apache.spark.sql.connector.catalog.DelegatingCatalogExtension.listTables(DelegatingCatalogExtension.java:65)
at org.apache.spark.sql.execution.datasources.v2.ShowTablesExec.run(ShowTablesExec.scala:42)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:43)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:43)
at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:49)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:130)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$5(SQLExecution.scala:169)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:316)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:143)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:854)
at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:113)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:266)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:130)
at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:485)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:86)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:485)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:268)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:264)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:461)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:126)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:324)
at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:126)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:111)
at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:102)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:225)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:104)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:854)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:101)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:689)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:854)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:684)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:695)
at com.databricks.backend.daemon.driver.SQLDriverLocal.$anonfun$executeSql$1(SQLDriverLocal.scala:91)
at scala.collection.immutable.List.map(List.scala:293)
at com.databricks.backend.daemon.driver.SQLDriverLocal.executeSql(SQLDriverLocal.scala:37)
at com.databricks.backend.daemon.driver.SQLDriverLocal.repl(SQLDriverLocal.scala:144)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:564)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:215)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:213)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:210)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:50)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:251)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:243)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:50)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:541)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:693)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:685)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:638)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)
21/12/03 07:13:08 INFO HiveMetaStore: 1: get_database: default
21/12/03 07:13:08 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/12/03 07:13:08 INFO HiveMetaStore: 1: get_database: default
21/12/03 07:13:08 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_database: default
21/12/03 07:13:08 INFO HiveMetaStore: 1: get_tables: db=default pat=*
21/12/03 07:13:08 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_tables: db=default pat=*
21/12/03 07:13:08 INFO ClusterLoadMonitor: Removed query with execution ID:5. Current active queries:0
21/12/03 07:13:08 INFO ClusterLoadMonitor: Added query with execution ID:6. Current active queries:1
21/12/03 07:13:08 INFO ClusterLoadMonitor: Removed query with execution ID:6. Current active queries:0
21/12/03 07:13:08 INFO ProgressReporter$: Removed result fetcher for 8876259256849300678_7196483158319821183_089ed89b-25a6-4193-9534-398c161e8b10
21/12/03 07:13:08 INFO DriverILoop: Set class prefix to: $line179927e71e7742c8833b12b3ff0bb1b0
21/12/03 07:13:08 INFO DriverILoop: set ContextClassLoader
21/12/03 07:13:08 INFO DriverILoop: initialized intp
21/12/03 07:13:10 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 2.0782499999999997, New Ema: 1.7665124999999997
21/12/03 07:13:13 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 1.7665124999999997, New Ema: 1.5015356249999998
21/12/03 07:13:14 INFO PythonDriverWrapper: setupRepl:ReplId-2e435-4d4b2-83855-c: finished to load
21/12/03 07:13:14 INFO ProgressReporter$: Added result fetcher for 3333601421689062748_6838960124352719855_70bec9d9-16de-45ef-870e-cb47a68f2cb0
21/12/03 07:13:16 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 1.5015356249999998, New Ema: 1.2763052812499998
21/12/03 07:13:18 INFO ProgressReporter$: Removed result fetcher for 3333601421689062748_6838960124352719855_70bec9d9-16de-45ef-870e-cb47a68f2cb0
21/12/03 07:13:19 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 1.2763052812499998, New Ema: 1.0848594890624998
21/12/03 07:13:22 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 1.0848594890624998, New Ema: 0.9221305657031248
21/12/03 07:13:25 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 0.9221305657031248, New Ema: 0.0
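
The recurring ClusterLoadAvgHelper lines track an exponentially weighted moving average of the active query count. The ten transitions logged above are consistent with a smoothing factor of 0.15 (for example 0.85 * 3.0 + 0.15 * 1 = 2.7), with the average jumping straight to the load whenever the load exceeds it, and resetting to 0.0 once it decays below 1.0 under zero load; the jump and reset rules are inferences from these logged values alone, not documented behavior. A Python sketch that reproduces the sequence:

# Sketch reproducing the ClusterLoadAvgHelper sequence above. ALPHA, the
# jump-up rule, and the below-1.0 reset are inferred from the ten logged
# transitions, not confirmed Databricks internals.
ALPHA = 0.15

def next_ema(old_ema: float, load: int) -> float:
    if load == 0 and old_ema < 1.0:
        return 0.0  # matches the 0.9221... -> 0.0 step at 07:13:25
    return max(float(load), (1 - ALPHA) * old_ema + ALPHA * load)

ema = 0.0
for load in [3, 1, 1, 0, 0, 0, 0, 0, 0, 0]:
    ema = next_ema(ema, load)
    print(ema)  # 3.0, 2.699..., 2.445, 2.07825, 1.76651..., ..., 0.0
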
21/12/03 07:13:27 INFO core: [com.couchbase.core][CoreCreatedEvent] {"clientVersion":null,"clientGitHash":null,"coreVersion":null,"coreGitHash":null,"userAgent":"couchbase-scala/0.0.0 (Linux 5.4.0-1058-aws amd64; OpenJDK 64-Bit Server VM 1.8.0_302-b08)","maxNumRequestsInRetry":32768,"ioEnvironment":{"nativeIoEnabled":true,"eventLoopThreadCount":2,"eventLoopGroups":["EpollEventLoopGroup"]},"ioConfig":{"captureTraffic":[],"mutationTokensEnabled":true,"networkResolution":"auto","dnsSrvEnabled":true,"tcpKeepAlivesEnabled":true,"tcpKeepAliveTimeMs":60000,"configPollIntervalMs":2500,"kvCircuitBreakerConfig":"disabled","queryCircuitBreakerConfig":"disabled","viewCircuitBreakerConfig":"disabled","searchCircuitBreakerConfig":"disabled","analyticsCircuitBreakerConfig":"disabled","managerCircuitBreakerConfig":"disabled","eventingCircuitBreakerConfig":"disabled","numKvConnections":1,"maxHttpConnections":12,"idleHttpConnectionTimeoutMs":4500,"configIdleRedialTimeoutMs":300000,"memcachedHashingStrategy":"StandardMemcachedHashingStrategy"},"compressionConfig":{"enabled":true,"minRatio":0.83,"minSize":32},"securityConfig":{"tlsEnabled":false,"nativeTlsEnabled":true,"hostnameVerificationEnabled":true,"hasTrustCertificates":false,"trustManagerFactory":null,"ciphers":[]},"timeoutConfig":{"kvMs":2500,"kvDurableMs":10000,"managementMs":75000,"queryMs":75000,"viewMs":75000,"searchMs":75000,"analyticsMs":75000,"connectMs":10000,"disconnectMs":10000,"eventingMs":75000},"loggerConfig":{"customLogger":null,"fallbackToConsole":false,"consoleLogLevel":{"name":"INFO","resourceBundleName":"sun.util.logging.resources.logging","localizedName":"INFO"},"disableSlf4j":false,"loggerName":"CouchbaseLogger","diagnosticContextEnabled":false},"orphanReporterConfig":{"emitIntervalMs":10000,"sampleSize":10,"queueLength":1024,"enabled":true},"thresholdLoggingTracerConfig":{"enabled":true,"emitIntervalMs":10000,"sampleSize":10,"queueLength":1024,"kvThresholdMs":500,"queryThresholdMs":1000,"searchThresholdMs":1000,"analyticsThresholdMs":1000,"viewThresholdMs":1000},"loggingMeterConfig":{"enabled":true,"emitIntervalMs":600000},"retryStrategy":"BestEffortRetryStrategy","requestTracer":"ThresholdLoggingTracer","meter":"LoggingMeter","numRequestCallbacks":0} {"coreId":"0x4879041300000001","seedNodes":[{"address":"myhostname"}]}
21/12/03 07:13:27 INFO node: [com.couchbase.node][NodeConnectedEvent] Node connected {"coreId":"0x4879041300000001","managerPort":"8091","remote":"myhostname"}
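
The two Couchbase SDK events above end in JSON payloads appended after the bracketed event tag: an environment dump, then an event context carrying the coreId and seed nodes. A minimal Python sketch for peeling those payloads off such a line with an incremental JSON decoder (assuming the raw log line is held in a string named line):

# Sketch: yield each JSON object embedded in a Couchbase SDK log line.
import json

def couchbase_payloads(line: str):
    decoder = json.JSONDecoder()
    idx = line.find("{")
    while idx != -1:
        obj, end = decoder.raw_decode(line, idx)  # parse one object, get its end
        yield obj
        idx = line.find("{", end)

# For the CoreCreatedEvent line: env, ctx = couchbase_payloads(line)
# ctx["coreId"] == '0x4879041300000001'; env["timeoutConfig"]["kvMs"] == 2500
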
21/12/03 07:13:31 INFO CodeGenerator: Code generated in 69.663012 ms
21/12/03 07:13:31 INFO CodeGenerator: Code generated in 21.453944 ms
21/12/03 07:13:31 INFO SparkContext: Starting job: json at QueryTableProvider.scala:87
21/12/03 07:13:32 INFO DAGScheduler: Got job 0 (json at QueryTableProvider.scala:87) with 8 output partitions
21/12/03 07:13:32 INFO DAGScheduler: Final stage: ResultStage 0 (json at QueryTableProvider.scala:87)
21/12/03 07:13:32 INFO DAGScheduler: Parents of final stage: List()
21/12/03 07:13:32 INFO DAGScheduler: Missing parents: List()
21/12/03 07:13:32 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[3] at json at QueryTableProvider.scala:87), which has no missing parents
21/12/03 07:13:32 INFO DAGScheduler: Jars for session None: Map(spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar -> 1638515161968)
21/12/03 07:13:32 INFO DAGScheduler: Files for session None: Map(file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar -> 1638515161870)
21/12/03 07:13:32 INFO DAGScheduler: Archives for session None: Map()
21/12/03 07:13:32 INFO DAGScheduler: Submitting 8 missing tasks from ResultStage 0 (MapPartitionsRDD[3] at json at QueryTableProvider.scala:87) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7))
21/12/03 07:13:32 INFO TaskSchedulerImpl: Adding task set 0.0 with 8 tasks resource profile 0
21/12/03 07:13:32 WARN FairSchedulableBuilder: A job was submitted with scheduler pool 2724283203918366252, which has not been configured. This can happen when the file that pools are read from isn't set, or when that file doesn't contain 2724283203918366252. Created 2724283203918366252 with default configuration (schedulingMode: FIFO, minShare: 0, weight: 1)
21/12/03 07:13:32 INFO FairSchedulableBuilder: Added task set TaskSet_0.0 tasks to pool 2724283203918366252
21/12/03 07:13:32 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 0, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 1, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 2, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 3, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 4, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 5, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 6, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 7, PROCESS_LOCAL, taskResourceAssignments Map())
21/12/03 07:13:32 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 192.2 KiB, free 3.9 GiB)
21/12/03 07:13:32 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 43.8 KiB, free 3.9 GiB)
21/12/03 07:13:32 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.172.227.195:37965 (size: 43.8 KiB, free: 3.9 GiB)
21/12/03 07:13:32 INFO SparkContext: Created broadcast 0 from broadcast at TaskSetManager.scala:613
21/12/03 07:13:33 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
21/12/03 07:13:33 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
21/12/03 07:13:33 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
21/12/03 07:13:33 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
21/12/03 07:13:33 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
21/12/03 07:13:33 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/12/03 07:13:33 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
21/12/03 07:13:33 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
21/12/03 07:13:33 INFO Executor: Fetching file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar with timestamp 1638515161870
21/12/03 07:13:33 INFO Utils: /local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar has been previously copied to /local_disk0/spark-2d2bac26-5d3c-4daf-8661-3eef5f46af82/userFiles-a66bdc58-cc3f-433e-b825-3bf0a0fe1651/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar
21/12/03 07:13:33 INFO Executor: Fetching spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar with timestamp 1638515161968
21/12/03 07:13:33 INFO TransportClientFactory: Successfully created connection to /10.172.227.195:33127 after 71 ms (0 ms spent in bootstraps)
21/12/03 07:13:33 INFO Utils: Fetching spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar to /local_disk0/spark-2d2bac26-5d3c-4daf-8661-3eef5f46af82/userFiles-a66bdc58-cc3f-433e-b825-3bf0a0fe1651/fetchFileTemp936817662970870172.tmp
21/12/03 07:13:34 INFO Utils: /local_disk0/spark-2d2bac26-5d3c-4daf-8661-3eef5f46af82/userFiles-a66bdc58-cc3f-433e-b825-3bf0a0fe1651/fetchFileTemp936817662970870172.tmp has been previously copied to /local_disk0/spark-2d2bac26-5d3c-4daf-8661-3eef5f46af82/userFiles-a66bdc58-cc3f-433e-b825-3bf0a0fe1651/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar
21/12/03 07:13:34 INFO Executor: Adding file:/local_disk0/spark-2d2bac26-5d3c-4daf-8661-3eef5f46af82/userFiles-a66bdc58-cc3f-433e-b825-3bf0a0fe1651/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar to class loader for default
21/12/03 07:13:35 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
... (the same SQLConf deprecation warning repeated 7 more times)
21/12/03 07:13:35 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 1747 bytes result sent to driver
21/12/03 07:13:35 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 3220 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (1/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 3229 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (2/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3312 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (3/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 3242 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (4/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 3242 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (5/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 3246 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (6/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 3234 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (7/8)
21/12/03 07:13:35 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 3241 ms on ip-10-172-227-195.us-west-2.compute.internal (executor driver) (8/8)
21/12/03 07:13:35 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 2724283203918366252
21/12/03 07:13:35 INFO DAGScheduler: ResultStage 0 (json at QueryTableProvider.scala:87) finished in 3.670 s
21/12/03 07:13:36 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
21/12/03 07:13:36 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
21/12/03 07:13:36 INFO DAGScheduler: Job 0 finished: json at QueryTableProvider.scala:87, took 4.104747 s
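Job 0, "json at QueryTableProvider.scala:87", is schema inference: judging by the call site, the connector runs the query once and hands the resulting JSON strings to Spark's JSON reader to derive a schema (an assumption about the connector's internals; only the call site name is in the log). The mechanism can be sketched with stand-in data:

    import spark.implicits._

    // Hypothetical sample documents standing in for rows returned by the query:
    val sampled = Seq(
      """{"callsign":"MILE-AIR","name":"40-Mile Air"}""",
      """{"callsign":"TXW","name":"Texas Wings"}"""
    ).toDS()

    // spark.read.json over a Dataset[String] runs a small job, like Job 0 here,
    // and infers the schema from the sampled documents:
    println(spark.read.json(sampled).schema.treeString)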
21/12/03 07:13:36 INFO CodeGenerator: Code generated in 20.11015 ms
21/12/03 07:13:36 INFO ClusterLoadMonitor: Added query with execution ID:7. Current active queries:1
21/12/03 07:13:36 INFO V2ScanRelationPushDown:
Output: callsign#50, name#55
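The pruned output (callsign#50, name#55) plus the TakeOrderedAndProjectExec further down imply the notebook selected those two fields and sorted before show(). A plausible reconstruction of the failing cell follows; this is an assumption, since the source of command-3706255681241256 is not in the log:

    // Reads through the connector's DataSource V2 "couchbase.query" format;
    // select() is what V2ScanRelationPushDown prunes the scan to, and
    // orderBy() is what later surfaces as takeOrdered in the stack trace.
    spark.read
      .format("couchbase.query")
      .load()
      .select("callsign", "name")
      .orderBy("name")          // sort column is a guess
      .show()                   // triggers Job 1, which fails below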
21/12/03 07:13:38 INFO ClusterLoadAvgHelper: Current cluster load: 1, Old Ema: 0.0, New Ema: 1.0
21/12/03 07:13:38 INFO BlockManagerInfo: Removed broadcast_0_piece0 on 10.172.227.195:37965 in memory (size: 43.8 KiB, free: 3.9 GiB)
21/12/03 07:13:38 INFO CodeGenerator: Code generated in 88.774172 ms
21/12/03 07:13:38 INFO CodeGenerator: Code generated in 25.853437 ms
21/12/03 07:13:38 INFO SparkContext: Starting job: show at command-3706255681241256:17
21/12/03 07:13:38 INFO DAGScheduler: Got job 1 (show at command-3706255681241256:17) with 1 output partitions
21/12/03 07:13:38 INFO DAGScheduler: Final stage: ResultStage 1 (show at command-3706255681241256:17)
21/12/03 07:13:38 INFO DAGScheduler: Parents of final stage: List()
21/12/03 07:13:38 INFO DAGScheduler: Missing parents: List()
21/12/03 07:13:38 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[13] at show at command-3706255681241256:17), which has no missing parents
21/12/03 07:13:38 INFO DAGScheduler: Jars for session None: Map(spark://10.172.227.195:33127/jars/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar -> 1638515161968)
21/12/03 07:13:38 INFO DAGScheduler: Files for session None: Map(file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar -> 1638515161870)
21/12/03 07:13:38 INFO DAGScheduler: Archives for session None: Map()
21/12/03 07:13:38 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[13] at show at command-3706255681241256:17) (first 15 tasks are for partitions Vector(0))
21/12/03 07:13:38 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks resource profile 0
21/12/03 07:13:38 WARN FairSchedulableBuilder: A job was submitted with scheduler pool 2724283203918366252, which has not been configured. This can happen when the file that pools are read from isn't set, or when that file doesn't contain 2724283203918366252. Created 2724283203918366252 with default configuration (schedulingMode: FIFO, minShare: 0, weight: 1)
21/12/03 07:13:38 INFO FairSchedulableBuilder: Added task set TaskSet_1.0 tasks to pool 2724283203918366252
21/12/03 07:13:38 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 8) (ip-10-172-227-195.us-west-2.compute.internal, executor driver, partition 0, ANY, taskResourceAssignments Map())
21/12/03 07:13:38 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 77.1 KiB, free 3.9 GiB)
21/12/03 07:13:38 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 16.3 KiB, free 3.9 GiB)
21/12/03 07:13:38 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.172.227.195:37965 (size: 16.3 KiB, free: 3.9 GiB)
21/12/03 07:13:38 INFO SparkContext: Created broadcast 1 from broadcast at TaskSetManager.scala:613
21/12/03 07:13:38 INFO Executor: Running task 0.0 in stage 1.0 (TID 8)
21/12/03 07:13:38 INFO CodeGenerator: Code generated in 31.788252 ms
21/12/03 07:13:38 INFO CodeGenerator: Code generated in 22.865897 ms
21/12/03 07:13:39 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 8)
java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
at org.apache.spark.sql.CouchbaseJsonUtils$.$anonfun$createParser$1(CouchbaseJsonUtils.scala:41)
at org.apache.spark.sql.catalyst.json.JacksonParser.$anonfun$parse$1(JacksonParser.scala:490)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2965)
at org.apache.spark.sql.catalyst.json.JacksonParser.parse(JacksonParser.scala:490)
at com.couchbase.spark.query.QueryPartitionReader.$anonfun$rows$2(QueryPartitionReader.scala:54)
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
at com.couchbase.spark.query.QueryPartitionReader.rows$lzycompute(QueryPartitionReader.scala:52)
at com.couchbase.spark.query.QueryPartitionReader.rows(QueryPartitionReader.scala:49)
at com.couchbase.spark.query.QueryPartitionReader.rowIterator$lzycompute(QueryPartitionReader.scala:61)
at com.couchbase.spark.query.QueryPartitionReader.rowIterator(QueryPartitionReader.scala:61)
at com.couchbase.spark.query.QueryPartitionReader.next(QueryPartitionReader.scala:101)
at org.apache.spark.sql.execution.datasources.v2.PartitionIterator.hasNext(DataSourceRDD.scala:93)
at org.apache.spark.sql.execution.datasources.v2.MetricsIterator.hasNext(DataSourceRDD.scala:130)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:758)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:32)
at com.google.common.collect.Ordering.leastOf(Ordering.java:649)
at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2(RDD.scala:1533)
at org.apache.spark.SparkContext.$anonfun$runJob$3(SparkContext.scala:2588)
at org.apache.spark.scheduler.ResultTask.$anonfun$runTask$3(ResultTask.scala:75)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.ResultTask.$anonfun$runTask$1(ResultTask.scala:75)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:55)
at org.apache.spark.scheduler.Task.doRunTask(Task.scala:153)
at org.apache.spark.scheduler.Task.$anonfun$run$1(Task.scala:122)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.Task.run(Task.scala:93)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$13(Executor.scala:824)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1621)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:827)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:683)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
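This NoSuchMethodError is a binary-compatibility failure rather than a user-code bug: the assembly name (spark_connector_assembly_3_2_0_SNAPSHOT) says the connector was compiled against Apache Spark 3.2.0, where CreateJacksonParser.string(JsonFactory, String) exists, while the Databricks runtime evidently ships a catalyst in which that exact signature is absent, so linkage fails on first use. A reflection probe shows what the cluster actually provides (a diagnostic sketch for a notebook cell; reflection is used because the object is private to Spark):

    // Print every overload of CreateJacksonParser.string present at runtime.
    // If none has the shape (JsonFactory, String) => JsonParser, the
    // connector's compiled call site cannot link, producing exactly this error.
    val cls = Class.forName("org.apache.spark.sql.catalyst.json.CreateJacksonParser$")
    cls.getDeclaredMethods
      .filter(_.getName == "string")
      .foreach(m => println(m.toGenericString))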
21/12/03 07:13:39 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 8) (ip-10-172-227-195.us-west-2.compute.internal executor driver): java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
... (48 frames omitted; identical to the executor stack trace under "ERROR Executor" above)
21/12/03 07:13:39 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job
21/12/03 07:13:39 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 2724283203918366252
21/12/03 07:13:39 INFO TaskSchedulerImpl: Cancelling stage 1
21/12/03 07:13:39 INFO TaskSchedulerImpl: Killing all running tasks in stage 1: Stage cancelled
21/12/03 07:13:39 INFO DAGScheduler: ResultStage 1 (show at command-3706255681241256:17) failed in 1.080 s due to Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8) (ip-10-172-227-195.us-west-2.compute.internal executor driver): java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
... (48 frames omitted; identical to the executor stack trace under "ERROR Executor" above)
Driver stacktrace:
21/12/03 07:13:39 INFO DAGScheduler: Job 1 failed: show at command-3706255681241256:17, took 1.112161 s
21/12/03 07:13:39 INFO ClusterLoadMonitor: Removed query with execution ID:7. Current active queries:0
21/12/03 07:13:39 ERROR ScalaDriverLocal: User Code Stack Trace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8) (ip-10-172-227-195.us-west-2.compute.internal executor driver): java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
... (48 frames omitted; identical to the executor stack trace under "ERROR Executor" above)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2973)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2920)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2914)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2914)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1334)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1334)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1334)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3182)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3123)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3111)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:1096)
at org.apache.spark.SparkContext.runJobInternal(SparkContext.scala:2494)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2477)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2589)
at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$1(RDD.scala:1548)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:165)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:125)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:419)
at org.apache.spark.rdd.RDD.takeOrdered(RDD.scala:1524)
at org.apache.spark.sql.execution.TakeOrderedAndProjectExec.executeCollect(limit.scala:213)
at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:68)
at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:87)
at org.apache.spark.sql.execution.collect.InternalRowFormat$.collect(cachedSparkResults.scala:75)
at org.apache.spark.sql.execution.collect.InternalRowFormat$.collect(cachedSparkResults.scala:62)
at org.apache.spark.sql.execution.ResultCacheManager.collectResult$1(ResultCacheManager.scala:547)
at org.apache.spark.sql.execution.ResultCacheManager.computeResult(ResultCacheManager.scala:554)
at org.apache.spark.sql.execution.ResultCacheManager.$anonfun$getOrComputeResultInternal$1(ResultCacheManager.scala:500)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResultInternal(ResultCacheManager.scala:499)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:396)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:375)
at org.apache.spark.sql.execution.SparkPlan.executeCollectResult(SparkPlan.scala:408)
at org.apache.spark.sql.Dataset.collectResult(Dataset.scala:3102)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3908)
at org.apache.spark.sql.Dataset.$anonfun$head$1(Dataset.scala:2826)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3900)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$5(SQLExecution.scala:169)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:316)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:143)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:854)
at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:113)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:266)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3898)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2826)
at org.apache.spark.sql.Dataset.take(Dataset.scala:3033)
at org.apache.spark.sql.Dataset.getRows(Dataset.scala:293)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:332)
at org.apache.spark.sql.Dataset.show(Dataset.scala:821)
at org.apache.spark.sql.Dataset.show(Dataset.scala:780)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-3706255681241256:17)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-3706255681241256:66)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw$$iw.<init>(command-3706255681241256:68)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw.<init>(command-3706255681241256:70)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw.<init>(command-3706255681241256:72)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw.<init>(command-3706255681241256:74)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read.<init>(command-3706255681241256:76)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$.<init>(command-3706255681241256:80)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$.<clinit>(command-3706255681241256)
at $line179927e71e7742c8833b12b3ff0bb1b025.$eval$.$print$lzycompute(<notebook>:7)
at $line179927e71e7742c8833b12b3ff0bb1b025.$eval$.$print(<notebook>:6)
at $line179927e71e7742c8833b12b3ff0bb1b025.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:219)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:235)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:926)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:879)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:235)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:564)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:215)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:213)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:210)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:50)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:251)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:243)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:50)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:541)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:693)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:685)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:638)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
... (frames omitted; identical to the executor stack trace under "ERROR Executor" above)
21/12/03 07:13:39 INFO ProgressReporter$: Removed result fetcher for 2724283203918366252_5083101478493670975_089e24cf239e4bed8856ba9d5de733bd
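The usual way out of this class of error is version alignment: run the connector on a runtime whose Spark matches the Spark it was built against, or rebuild the connector against the cluster's own Spark jars. A hedged build.sbt sketch, with coordinates and versions for illustration only (check what your Databricks runtime actually ships):

    // Keep Spark "provided" so the cluster's jars win at runtime, and pick a
    // connector release built for the same Spark line the cluster runs:
    libraryDependencies ++= Seq(
      "org.apache.spark"     %% "spark-sql"       % "3.2.0" % Provided,
      "com.couchbase.client" %% "spark-connector" % "3.2.0"
    )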
21/12/03 07:13:40 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 1.0, New Ema: 0.85
21/12/03 07:13:43 INFO ClusterLoadAvgHelper: Current cluster load: 0, Old Ema: 0.85, New Ema: 0.0
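The three ClusterLoadAvgHelper lines read like an exponential moving average over active-query load. Only the middle transition (1.0 to 0.85 at load 0) fits the standard update ema' = alpha * load + (1 - alpha) * ema, implying alpha of roughly 0.15; the first step (0.0 to 1.0) and the last (0.85 to 0.0) do not, which hints at bootstrap and idle-reset special cases. This is inference from three data points, not the helper's source:

    // Standard EMA step; alpha = 0.15 is fitted to the one observable decay
    // step in this log and is an assumption.
    def nextEma(ema: Double, load: Double, alpha: Double = 0.15): Double =
      alpha * load + (1 - alpha) * ema

    println(nextEma(1.0, 0.0))  // 0.85, matching the 07:13:40 line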
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
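The MaxPermSize warning comes from the JVM, not Spark: PermGen was removed in Java 8 (this driver runs OpenJDK 1.8.0_302, per the user agent logged above), so -XX:MaxPermSize=512m is parsed and ignored. If a cap on class metadata is actually wanted, the Java 8 equivalent is -XX:MaxMetaspaceSize=512m; otherwise the flag can simply be dropped from the driver JVM options.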
Fri Dec 3 07:13:11 2021 Connection to spark from PID 1192
Fri Dec 3 07:13:11 2021 Initialized gateway on port 40723
Fri Dec 3 07:13:12 2021 Connected to spark.
21/12/03 07:13:39 ERROR Uncaught throwable from user code: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8) (ip-10-172-227-195.us-west-2.compute.internal executor driver): java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
at org.apache.spark.sql.CouchbaseJsonUtils$.$anonfun$createParser$1(CouchbaseJsonUtils.scala:41)
at org.apache.spark.sql.catalyst.json.JacksonParser.$anonfun$parse$1(JacksonParser.scala:490)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2965)
at org.apache.spark.sql.catalyst.json.JacksonParser.parse(JacksonParser.scala:490)
at com.couchbase.spark.query.QueryPartitionReader.$anonfun$rows$2(QueryPartitionReader.scala:54)
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
at com.couchbase.spark.query.QueryPartitionReader.rows$lzycompute(QueryPartitionReader.scala:52)
at com.couchbase.spark.query.QueryPartitionReader.rows(QueryPartitionReader.scala:49)
at com.couchbase.spark.query.QueryPartitionReader.rowIterator$lzycompute(QueryPartitionReader.scala:61)
at com.couchbase.spark.query.QueryPartitionReader.rowIterator(QueryPartitionReader.scala:61)
at com.couchbase.spark.query.QueryPartitionReader.next(QueryPartitionReader.scala:101)
at org.apache.spark.sql.execution.datasources.v2.PartitionIterator.hasNext(DataSourceRDD.scala:93)
at org.apache.spark.sql.execution.datasources.v2.MetricsIterator.hasNext(DataSourceRDD.scala:130)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:758)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:32)
at com.google.common.collect.Ordering.leastOf(Ordering.java:649)
at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2(RDD.scala:1533)
at org.apache.spark.SparkContext.$anonfun$runJob$3(SparkContext.scala:2588)
at org.apache.spark.scheduler.ResultTask.$anonfun$runTask$3(ResultTask.scala:75)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.ResultTask.$anonfun$runTask$1(ResultTask.scala:75)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:55)
at org.apache.spark.scheduler.Task.doRunTask(Task.scala:153)
at org.apache.spark.scheduler.Task.$anonfun$run$1(Task.scala:122)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.Task.run(Task.scala:93)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$13(Executor.scala:824)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1621)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:827)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:683)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2973)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2920)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2914)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2914)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1334)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1334)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1334)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3182)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3123)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:3111)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:1096)
at org.apache.spark.SparkContext.runJobInternal(SparkContext.scala:2494)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2477)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2589)
at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$1(RDD.scala:1548)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:165)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:125)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:419)
at org.apache.spark.rdd.RDD.takeOrdered(RDD.scala:1524)
at org.apache.spark.sql.execution.TakeOrderedAndProjectExec.executeCollect(limit.scala:213)
at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:68)
at org.apache.spark.sql.execution.collect.Collector$.collect(Collector.scala:87)
at org.apache.spark.sql.execution.collect.InternalRowFormat$.collect(cachedSparkResults.scala:75)
at org.apache.spark.sql.execution.collect.InternalRowFormat$.collect(cachedSparkResults.scala:62)
at org.apache.spark.sql.execution.ResultCacheManager.collectResult$1(ResultCacheManager.scala:547)
at org.apache.spark.sql.execution.ResultCacheManager.computeResult(ResultCacheManager.scala:554)
at org.apache.spark.sql.execution.ResultCacheManager.$anonfun$getOrComputeResultInternal$1(ResultCacheManager.scala:500)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResultInternal(ResultCacheManager.scala:499)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:396)
at org.apache.spark.sql.execution.ResultCacheManager.getOrComputeResult(ResultCacheManager.scala:375)
at org.apache.spark.sql.execution.SparkPlan.executeCollectResult(SparkPlan.scala:408)
at org.apache.spark.sql.Dataset.collectResult(Dataset.scala:3102)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3908)
at org.apache.spark.sql.Dataset.$anonfun$head$1(Dataset.scala:2826)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3900)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$5(SQLExecution.scala:169)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:316)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:143)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:854)
at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:113)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:266)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3898)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2826)
at org.apache.spark.sql.Dataset.take(Dataset.scala:3033)
at org.apache.spark.sql.Dataset.getRows(Dataset.scala:293)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:332)
at org.apache.spark.sql.Dataset.show(Dataset.scala:821)
at org.apache.spark.sql.Dataset.show(Dataset.scala:780)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-3706255681241256:17)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-3706255681241256:66)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw$$iw.<init>(command-3706255681241256:68)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw$$iw.<init>(command-3706255681241256:70)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw$$iw.<init>(command-3706255681241256:72)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$$iw.<init>(command-3706255681241256:74)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read.<init>(command-3706255681241256:76)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$.<init>(command-3706255681241256:80)
at $line179927e71e7742c8833b12b3ff0bb1b025.$read$.<clinit>(command-3706255681241256)
at $line179927e71e7742c8833b12b3ff0bb1b025.$eval$.$print$lzycompute(<notebook>:7)
at $line179927e71e7742c8833b12b3ff0bb1b025.$eval$.$print(<notebook>:6)
at $line179927e71e7742c8833b12b3ff0bb1b025.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:219)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:235)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:926)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:879)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:235)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$13(DriverLocal.scala:564)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:215)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:213)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:210)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:50)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:251)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:243)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:50)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:541)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:693)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:685)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:526)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:638)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:431)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:374)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:225)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.json.CreateJacksonParser$.string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)Lcom/fasterxml/jackson/core/JsonParser;
at org.apache.spark.sql.CouchbaseJsonUtils$.$anonfun$createParser$1(CouchbaseJsonUtils.scala:41)
at org.apache.spark.sql.catalyst.json.JacksonParser.$anonfun$parse$1(JacksonParser.scala:490)
at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2965)
at org.apache.spark.sql.catalyst.json.JacksonParser.parse(JacksonParser.scala:490)
at com.couchbase.spark.query.QueryPartitionReader.$anonfun$rows$2(QueryPartitionReader.scala:54)
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
at com.couchbase.spark.query.QueryPartitionReader.rows$lzycompute(QueryPartitionReader.scala:52)
at com.couchbase.spark.query.QueryPartitionReader.rows(QueryPartitionReader.scala:49)
at com.couchbase.spark.query.QueryPartitionReader.rowIterator$lzycompute(QueryPartitionReader.scala:61)
at com.couchbase.spark.query.QueryPartitionReader.rowIterator(QueryPartitionReader.scala:61)
at com.couchbase.spark.query.QueryPartitionReader.next(QueryPartitionReader.scala:101)
at org.apache.spark.sql.execution.datasources.v2.PartitionIterator.hasNext(DataSourceRDD.scala:93)
at org.apache.spark.sql.execution.datasources.v2.MetricsIterator.hasNext(DataSourceRDD.scala:130)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:758)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:32)
at com.google.common.collect.Ordering.leastOf(Ordering.java:649)
at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2(RDD.scala:1533)
at org.apache.spark.SparkContext.$anonfun$runJob$3(SparkContext.scala:2588)
at org.apache.spark.scheduler.ResultTask.$anonfun$runTask$3(ResultTask.scala:75)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.ResultTask.$anonfun$runTask$1(ResultTask.scala:75)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:55)
at org.apache.spark.scheduler.Task.doRunTask(Task.scala:153)
at org.apache.spark.scheduler.Task.$anonfun$run$1(Task.scala:122)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.scheduler.Task.run(Task.scala:93)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$13(Executor.scala:824)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1621)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:827)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.spark.util.ExecutorFrameProfiler$.record(ExecutorFrameProfiler.scala:110)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:683)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
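
The root cause above is the java.lang.NoSuchMethodError at line "Caused by": the Couchbase Spark connector (loaded further down from a spark_connector_assembly_3_2_0_SNAPSHOT jar) was compiled against a Spark build whose catalyst module exposes CreateJacksonParser.string(JsonFactory, String): JsonParser, but the catalyst jar on this Databricks runtime resolves that call to a different signature, so linkage fails inside QueryPartitionReader when the first row is parsed. Below is a minimal sketch of the failing call shape, assuming a connector-side helper placed in the org.apache.spark.sql package (as CouchbaseJsonUtils is, to reach catalyst internals); the object and method names here are illustrative, not the connector's actual code.

package org.apache.spark.sql

import com.fasterxml.jackson.core.{JsonFactory, JsonParser}
import org.apache.spark.sql.catalyst.json.CreateJacksonParser

object JsonParserLinkageSketch {
  // Resolved at compile time against one Spark build. If the cluster's
  // catalyst jar lacks the exact descriptor from the stack trace --
  //   string(Lcom/fasterxml/jackson/core/JsonFactory;Ljava/lang/String;)
  //     Lcom/fasterxml/jackson/core/JsonParser;
  // -- the JVM throws java.lang.NoSuchMethodError at this call site.
  def createParser(factory: JsonFactory, record: String): JsonParser =
    CreateJacksonParser.string(factory, record)
}
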
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$Lambda$3579/116810772 0x00000007c1962028]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$Lambda$3577/1063291954 0x00000007c1961828]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$Lambda$3576/1309136790 0x00000007c1961428]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$Lambda$3574/2121797368 0x00000007c1960c28]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$$Lambda$3571/1541142878 0x00000007c1960028]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$$Lambda$3570/1508614028 0x00000007c195fc28]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$$Lambda$3569/1421835088 0x00000007c195f828]
[Unloading class com.databricks.sql.optimizer.EnsureRequirementsDP$$Lambda$3566/1735232456 0x00000007c195ec28]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3560/327395893 0x00000007c195cc28]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3559/1856835196 0x00000007c195c828]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3557/1692364997 0x00000007c195b028]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3555/2098206234 0x00000007c195bc28]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3554/19701296 0x00000007c1959c28]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3552/995604649 0x00000007c1959428]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3551/1526957254 0x00000007c1958c28]
[Unloading class org.apache.spark.sql.execution.SparkPlan$$Lambda$3545/755507448 0x00000007c1958828]
[Unloading class org.apache.spark.sql.execution.exchange.EnsureRequirements$$Lambda$3548/1034273133 0x00000007c1958028]
[Unloading class org.apache.spark.sql.execution.SparkPlan$$Lambda$3547/372829123 0x00000007c1957828]
[Unloading class org.apache.spark.sql.execution.SparkPlan$$Lambda$3543/474414273 0x00000007c1957028]
[Unloading class org.apache.spark.sql.execution.RemoveRedundantProjects$$$Lambda$3542/1774439702 0x00000007c1956c28]
[Unloading class org.apache.spark.sql.execution.RemoveRedundantProjects$$$Lambda$3541/1729660380 0x00000007c1956828]
[Unloading class org.apache.spark.sql.execution.PlanSubqueries$$Lambda$3539/651166248 0x00000007c194dc28]
[Unloading class org.apache.spark.sql.execution.QueryExecution$$$Lambda$3537/456799968 0x00000007c194d828]
[Unloading class org.apache.spark.sql.execution.QueryExecution$$$Lambda$3535/1498773283 0x00000007c194c428]
[Unloading class org.apache.spark.sql.execution.QueryExecution$$$Lambda$3533/1384298653 0x00000007c193b828]
[Unloading class org.apache.spark.sql.execution.QueryExecution$$Lambda$3530/242083686 0x00000007c193b028]
[Unloading class org.apache.spark.sql.catalyst.planning.QueryPlanner$$Lambda$3527/478464630 0x00000007c193a028]
[Unloading class org.apache.spark.sql.execution.SparkPlan$$$Lambda$3522/1178390432 0x00000007c1938428]
[Unloading class org.apache.spark.sql.catalyst.planning.QueryPlanner$$Lambda$3518/1630008079 0x00000007c1927c28]
[Unloading class org.apache.spark.sql.catalyst.planning.QueryPlanner$$Lambda$3517/2027335940 0x00000007c1927028]
[Unloading class org.apache.spark.sql.catalyst.planning.QueryPlanner$$Lambda$3515/898123310 0x00000007c1926828]
[Unloading class org.apache.spark.sql.SparkSessionExtensions$$Lambda$3512/1038330703 0x00000007c1925c28]
[Unloading class org.apache.spark.sql.execution.QueryExecution$$Lambda$3511/918760056 0x00000007c191d828]
[Unloading class org.apache.spark.sql.execution.dynamicpruning.CleanupDynamicPruningFilters$$$Lambda$3507/7295141 0x00000007c191cc28]
[Unloading class org.apache.spark.sql.execution.python.ExtractPythonUDFs$$$Lambda$3505/1361995748 0x00000007c191c028]
[Unloading class org.apache.spark.sql.execution.python.ExtractPythonUDFs$$$Lambda$3504/1296834034 0x00000007c191bc28]
[Unloading class org.apache.spark.sql.execution.python.ExtractGroupingPythonUDFFromAggregate$$$Lambda$3503/980291656 0x00000007c191b428]
[Unloading class org.apache.spark.sql.execution.python.ExtractGroupingPythonUDFFromAggregate$$$Lambda$3502/1930183248 0x00000007c191b828]
[Unloading class org.apache.spark.sql.execution.python.ExtractPythonUDFFromAggregate$$$Lambda$3499/963572037 0x00000007c191a828]
[Unloading class org.apache.spark.sql.execution.python.ExtractPythonUDFFromAggregate$$$Lambda$3498/208792147 0x00000007c191a428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.ExtractPythonUDFFromJoinCondition$$$Lambda$3496/636798069 0x00000007c1919c28]
[Unloading class com.databricks.sql.optimizer.ExtractPythonUDFFromWindow$$$Lambda$3495/799937955 0x00000007c1919828]
[Unloading class org.apache.spark.sql.catalyst.plans.QueryPlan$$Lambda$3493/347868949 0x00000007c1919028]
[Unloading class com.databricks.sql.transaction.tahoe.stats.PrepareDeltaScan$$Lambda$3489/1360988715 0x00000007c1918028]
[Unloading class com.databricks.sql.transaction.tahoe.stats.PrepareDeltaScan$$Lambda$3488/518604303 0x00000007c1917c28]
[Unloading class com.databricks.sql.transaction.tahoe.stats.PrepareDeltaScan$$Lambda$3486/649747891 0x00000007c1917428]
[Unloading class org.apache.spark.sql.catalyst.plans.QueryPlan$$Lambda$3484/564984568 0x00000007c1916c28]
[Unloading class org.apache.spark.sql.catalyst.trees.TreeNode$$Lambda$3482/903410659 0x00000007c1916428]
[Unloading class org.apache.spark.sql.catalyst.plans.QueryPlan$$Lambda$3479/1083116703 0x00000007c1915828]
[Unloading class com.databricks.sql.transaction.tahoe.stats.PrepareDeltaScan$$Lambda$3478/1726267999 0x00000007c1915028]
[Unloading class com.databricks.sql.expressions.aggregate.HyperLogLogPlusPlusDeclarativeRule$$$Lambda$3476/904161909 0x00000007c1914c28]
[Unloading class com.databricks.sql.optimizer.FilePruning$$$Lambda$3474/2055631693 0x00000007c190c428]
[Unloading class com.databricks.sql.optimizer.SkewJoinRewrite$$Lambda$3472/1837389411 0x00000007c190bc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.ReplaceUpdateFieldsExpression$$$Lambda$3464/1286444097 0x00000007c1907028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.ReplaceUpdateFieldsExpression$$$Lambda$3462/459179140 0x00000007c1907428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.NormalizeFloatingNumbers$$$Lambda$3461/1612545051 0x00000007c1906828]
[Unloading class com.databricks.sql.optimizer.ComplexTypeMinMax$$$Lambda$3459/1984219374 0x00000007c1906028]
[Unloading class com.databricks.sql.optimizer.ReduceSubstringMaterialization$$$Lambda$3456/208321590 0x00000007c1905428]
[Unloading class com.databricks.sql.optimizer.PushLimitIntoAggregate$$$Lambda$3454/424116817 0x00000007c1904c28]
[Unloading class com.databricks.sql.optimizer.CollapseMultipleWindowSpec$$$Lambda$3453/1203916231 0x00000007c1904428]
[Unloading class com.databricks.sql.optimizer.CollapseMultipleWindowSpec$$$Lambda$3451/896822183 0x00000007c1904828]
[Unloading class com.databricks.sql.optimizer.PushWindowSpecAndInputsIntoProject$$$Lambda$3450/1809733370 0x00000007c1903828]
[Unloading class com.databricks.sql.optimizer.ConvertInnerToSemiJoins$$$Lambda$3448/1056990693 0x00000007c1903428]
[Unloading class com.databricks.sql.optimizer.ConvertInnerToSemiJoins$$$Lambda$3446/448302194 0x00000007c1902c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RewritePredicateSubquery$$Lambda$3444/889398880 0x00000007c1902428]
[Unloading class com.databricks.sql.optimizer.PropagateEmptyRelationThroughAggregate$$$Lambda$3442/460257192 0x00000007c1901828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CombineTypedFilters$$$Lambda$3438/1288608872 0x00000007c1900c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CombineTypedFilters$$$Lambda$3437/146005805 0x00000007c1900828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RewriteDistinctAggregates$$$Lambda$3434/143749295 0x00000007c18ffc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RewriteDistinctAggregates$$$Lambda$3433/1435617982 0x00000007c18ff828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.DecimalAggregates$$$Lambda$3431/1661594007 0x00000007c18ff028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.DecimalAggregates$$$Lambda$3430/534555437 0x00000007c18fec28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.EliminateSorts$$$Lambda$3428/678518605 0x00000007c18fe428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.EliminateSorts$$$Lambda$3427/1720048667 0x00000007c18fe028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CostBasedJoinReorder$$$Lambda$3425/934251428 0x00000007c18fd828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CostBasedJoinReorder$$$Lambda$3423/2072120879 0x00000007c18fd028]
[Unloading class com.databricks.sql.optimizer.JoinORExpansion$$$Lambda$3421/436304206 0x00000007c18fc828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PushExtraPredicateThroughJoin$$$Lambda$3419/703789762 0x00000007c18f4028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PushExtraPredicateThroughJoin$$$Lambda$3417/36867698 0x00000007c18f3828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.InferFiltersFromGenerate$$$Lambda$3415/1052433543 0x00000007c18f3028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.InferFiltersFromGenerate$$$Lambda$3414/1096480264 0x00000007c18f2c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CombineConcats$$$Lambda$3412/91277557 0x00000007c18f2428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CombineConcats$$$Lambda$3411/1676886678 0x00000007c18f1c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.OptimizeCsvJsonExprs$$$Lambda$3409/1108443449 0x00000007c18f1828]
[Unloading class com.databricks.sql.optimizer.CombineApproximatePercentileAggregates$$$Lambda$3407/1080544582 0x00000007c18f1028]
[Unloading class com.databricks.sql.optimizer.CombineApproximatePercentileAggregates$$$Lambda$3406/265661395 0x00000007c18f0c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RemoveRedundantAggregates$$$Lambda$3404/825503174 0x00000007c18f0028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RemoveRedundantAggregates$$$Lambda$3403/774040617 0x00000007c18f0428]
[Unloading class com.databricks.sql.optimizer.ConvertRedundantAggregateToProject$$$Lambda$3401/1798164199 0x00000007c18ef828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.UnwrapCastInBinaryComparison$$$Lambda$3398/493417849 0x00000007c18eec28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.UnwrapCastInBinaryComparison$$$Lambda$3396/202908194 0x00000007c18ee428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.EliminateSerialization$$$Lambda$3395/2097955100 0x00000007c18ee028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.EliminateSerialization$$$Lambda$3393/621784350 0x00000007c18ed828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RewriteLateralSubquery$$$Lambda$3392/1495474911 0x00000007c18ed428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RewriteLateralSubquery$$$Lambda$3390/714758873 0x00000007c18ecc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.SimplifyCaseConversionExpressions$$$Lambda$3388/1245338317 0x00000007c18ec828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PruneFilters$$$Lambda$3386/526286856 0x00000007c18ebc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PruneFilters$$$Lambda$3385/182510575 0x00000007c18eb828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.SimplifyConditionalsInPredicate$$$Lambda$3383/1293671630 0x00000007c18eb028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.SimplifyConditionalsInPredicate$$$Lambda$3381/64360078 0x00000007c18ea828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.SimplifyBinaryComparison$$$Lambda$3379/1865371724 0x00000007c18ea028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.SimplifyBinaryComparison$$$Lambda$3377/744246854 0x00000007c18e9828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RemoveDispensableExpressions$$$Lambda$3376/1657260643 0x00000007c18e1428]
[Unloading class com.databricks.sql.optimizer.FilterPredicateSimplification$$$Lambda$3370/1028041830 0x00000007c18dfc28]
[Unloading class com.databricks.sql.optimizer.SimplifyRegExpReplace$$$Lambda$3362/137278931 0x00000007c18ddc28]
[Unloading class com.databricks.sql.optimizer.PushFoldableIntoBranches$$$Lambda$3358/876478970 0x00000007c18dcc28]
[Unloading class com.databricks.sql.optimizer.PushFoldableIntoBranches$$$Lambda$3356/1983993346 0x00000007c18dc428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.OptimizeIn$$$Lambda$3354/879358365 0x00000007c18dbc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.FoldablePropagation$$$Lambda$3351/1424613691 0x00000007c18db028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.NullPropagation$$$Lambda$3349/187747868 0x00000007c18da828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.OptimizeRepartition$$$Lambda$3346/2084988313 0x00000007c18d9c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CollapseProject$$$Lambda$3343/344287895 0x00000007c18d9028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.CollapseRepartition$$$Lambda$3340/1835055844 0x00000007c18d8428]
[Unloading class com.databricks.sql.optimizer.PushDownGetArrayItemToStringSplit$$$Lambda$3337/1502326358 0x00000007c18cfc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PushDownPredicates$$$Lambda$3331/1818774174 0x00000007c18ce028]
[Unloading class com.databricks.sql.optimizer.OptimizeCase$$$Lambda$3328/96534334 0x00000007c18cd428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.EliminateOuterJoin$$$Lambda$3325/672365198 0x00000007c18cc828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.ReorderJoin$$$Lambda$3323/1371787515 0x00000007c18cc428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$$Lambda$3321/331538239 0x00000007c18cbc28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.ReplaceDistinctWithAggregate$$$Lambda$3318/466112302 0x00000007c18cac28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RewriteExceptAll$$$Lambda$3312/1147533104 0x00000007c18c9428]
[Unloading class org.apache.spark.sql.catalyst.optimizer.Optimizer$OptimizeSubqueries$$$Lambda$3310/1598298257 0x00000007c18c8c28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.OptimizeOneRowRelationSubquery$$anonfun$rewrite$1$$Lambda$3305/611027380 0x00000007c18bf828]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PropagateEmptyRelation$$$Lambda$3301/1143991112 0x00000007c18bec28]
[Unloading class org.apache.spark.sql.catalyst.optimizer.OptimizeLimitZero$$$Lambda$3299/1176438250 0x00000007c18be028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.RemoveNoopUnion$$$Lambda$3297/1527650423 0x00000007c18bd828]
[Unloading class org.apache.spark.sql.catalyst.plans.QueryPlan$$Lambda$3293/696698780 0x00000007c18bc428]
[Unloading class org.apache.spark.sql.catalyst.plans.QueryPlan$$Lambda$3290/1597408584 0x00000007c18bc028]
[Unloading class org.apache.spark.sql.catalyst.optimizer.PullOutGroupingExpressions$$$Lambda$3280/1467728490 0x00000007c18b1428]
[Unloading class com.databricks.sql.transaction.tahoe.stats.PrepareDeltaScan$$Lambda$3275/1568286128 0x00000007c18a0028]
[Unloading class org.apache.spark.sql.SparkSessionExtensions$$Lambda$3232/861578318 0x00000007c1874828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c17a6028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c178c828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c177b828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c173fc28]
[Unloading class org.apache.derby.exe.ac109f00b9x017dx7f1cxda47x0001a0b80ab02 0x00000007c1726428]
[Unloading class org.apache.derby.exe.ac109f00b9x017dx7f1cxda47x0001a0b80ab01 0x00000007c171dc28]
[Unloading class org.apache.derby.exe.ac109f00b9x017dx7f1cxda47x0001a0b80ab00 0x00000007c170d428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1704c28]
[Unloading class org.apache.derby.exe.acd236c30ex017dx7f1cxda47x0001a0b80ab00 0x00000007c16f3c28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c16eb428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c16d2428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c16c9c28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c16a0428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1697c28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c168f428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1686c28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c167e428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1675c28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c166d428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1664c28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1653828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c164b028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1642828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c163a028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1631828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1629028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1620828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1618028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c160f828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1607028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15fe828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15f6028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15ed828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15e5028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15dc828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15d4028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15cb828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15c3028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15ba828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15b2028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15a9828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c15a1028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1598828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1590028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1587828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c157f028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1576828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c156dc28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1565428]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c155c028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1553828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c154b028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1542828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1539c28]
[Unloading class org.apache.derby.exe.acf04340b7x017dx7f1cxda47x0001a0b80ab00 0x00000007c1530028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1517828]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c150ec28]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c1506028]
[Unloading class org.apache.derby.exe.ac30fbc0bbx017dx7f1cxda47x0001a0b80ab00 0x00000007c14ed028]
[Unloading class org.apache.derby.exe.aca65c80acx017dx7f1cxda47x0001a0b80ab00 0x00000007c1484028]
[PSYoungGen: 140265K->0K(2142208K)] [ParOldGen: 330212K->261770K(4952064K)] 470478K->261770K(7094272K), [Metaspace: 243748K->239604K(1286144K)], 1.2334019 secs] [Times: user=2.18 sys=0.00, real=1.23 secs]
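
The [Unloading class ...] block and the GC summary above, and the [Loaded ...] lines below, are JDK 8 diagnostic output: a Parallel GC full collection (young, old, and metaspace sizes before->after) plus class load/unload tracing. As a sketch only — the log does not show how this driver was actually configured — flags of this kind would typically be enabled on a generic Spark driver like so:

import org.apache.spark.sql.SparkSession

object DriverDiagnostics {
  def main(args: Array[String]): Unit = {
    // Assumed flags for illustration: -verbose:class produces the
    // [Loaded ...] lines, -XX:+TraceClassUnloading the [Unloading ...]
    // lines, and -XX:+PrintGCDetails the per-generation GC summary.
    val spark = SparkSession.builder()
      .appName("driver-jvm-diagnostics")
      .config("spark.driver.extraJavaOptions",
        "-verbose:class -XX:+TraceClassUnloading -XX:+PrintGCDetails")
      .getOrCreate()
    spark.stop()
  }
}
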
[Loaded org.apache.spark.sql.catalyst.expressions.ExpressionSet from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.ExpressionSet$$$Lambda$6924/44026644 from org.apache.spark.sql.catalyst.expressions.ExpressionSet$]
[Loaded org.apache.spark.ContextCleaner$$Lambda$6925/1914002168 from org.apache.spark.ContextCleaner]
[Loaded com.databricks.sql.optimizer.ConvertInnerToSemiJoins$$$Lambda$6926/1877539454 from com.databricks.sql.optimizer.ConvertInnerToSemiJoins$]
[Loaded org.apache.spark.broadcast.TorrentBroadcast$$$Lambda$6927/915879741 from org.apache.spark.broadcast.TorrentBroadcast$]
[Loaded com.databricks.sql.expressions.OptimizeMultiSemiStructuredExtract$SemiStructuredExtractions$$$Lambda$6928/1442048439 from com.databricks.sql.expressions.OptimizeMultiSemiStructuredExtract$SemiStructuredExtractions$]
[Loaded org.apache.spark.storage.BlockManagerMessages$ToBlockManagerMasterStorageEndpoint from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$RemoveBroadcast from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$GetBlockStatus from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$GetShufflePushMergerLocations from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$RemoveShufflePushMergerLocation from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$IsExecutorAlive from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$GetMatchingBlockIds from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$RemoveRdd from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.expressions.OptimizeMultiSemiStructuredExtract$SemiStructuredExtractions$$anonfun$$nestedInanonfun$unapply$1$1 from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$RemoveShuffle from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMasterEndpoint$$Lambda$6929/599879923 from org.apache.spark.storage.BlockManagerMasterEndpoint]
[Loaded org.apache.spark.sql.catalyst.expressions.SemiStructuredExtractGetJsonObject from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMasterEndpoint$$Lambda$6930/787565800 from org.apache.spark.storage.BlockManagerMasterEndpoint]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1 from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$handleBlockRemovalFailure$1 from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerMessages$RemoveBlock from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded scala.concurrent.Future$$$Lambda$6931/1879103973 from scala.concurrent.Future$]
[Loaded scala.concurrent.Future$$$Lambda$6932/183558283 from scala.concurrent.Future$]
[Loaded org.apache.spark.storage.BlockManagerMessages$DecommissionBlockManager$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1$$Lambda$6933/99260713 from org.apache.spark.storage.BlockManagerStorageEndpoint$$anonfun$receiveAndReply$1]
[Loaded scala.concurrent.Future$$Lambda$6934/767323661 from scala.concurrent.Future]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$Lambda$6935/350969903 from org.apache.spark.storage.BlockManagerStorageEndpoint]
[Loaded scala.concurrent.Future$$Lambda$6936/1803838684 from scala.concurrent.Future]
[Loaded scala.concurrent.Future$$$Lambda$6937/490555320 from scala.concurrent.Future$]
[Loaded org.apache.spark.sql.execution.datasources.v2.V1ScanWrapper from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.connector.read.LocalScan from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy$$Lambda$6938/90060715 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$Lambda$6939/1879205396 from org.apache.spark.storage.BlockManagerStorageEndpoint]
[Loaded scala.concurrent.Future$$Lambda$6940/1059559971 from scala.concurrent.Future]
[Loaded scala.concurrent.Future$$Lambda$6941/2131369548 from scala.concurrent.Future]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$Lambda$6942/2042018169 from org.apache.spark.storage.BlockManagerStorageEndpoint]
[Loaded org.apache.spark.storage.BlockManagerMaster$$Lambda$6943/39124281 from org.apache.spark.storage.BlockManagerMaster]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$Lambda$6944/1986700316 from org.apache.spark.storage.BlockManagerStorageEndpoint]
[Loaded org.apache.spark.ContextCleaner$$Lambda$6946/447198804 from org.apache.spark.ContextCleaner]
[Loaded org.apache.spark.ContextCleaner$$Lambda$6947/149367740 from org.apache.spark.ContextCleaner]
[Loaded org.apache.spark.storage.BlockManager$$Lambda$6945/2024625313 from org.apache.spark.storage.BlockManager]
[Loaded org.apache.spark.storage.BlockManager$$Lambda$6948/2089996140 from org.apache.spark.storage.BlockManager]
[Loaded org.apache.spark.storage.BlockManager$$anonfun$1 from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockManager$$Lambda$6949/140705874 from org.apache.spark.storage.BlockManager]
[Loaded org.apache.spark.storage.BlockManager$$Lambda$6950/1658865565 from org.apache.spark.storage.BlockManager]
[Loaded org.apache.spark.util.io.ChunkedByteBuffer$$Lambda$6951/1505731029 from org.apache.spark.util.io.ChunkedByteBuffer]
[Loaded org.apache.spark.storage.memory.MemoryStore$$Lambda$6952/153725909 from org.apache.spark.storage.memory.MemoryStore]
[Loaded org.apache.spark.storage.BlockInfoManager$$Lambda$6953/425220914 from org.apache.spark.storage.BlockInfoManager]
[Loaded org.apache.spark.storage.BlockManagerInfo$$Lambda$6954/281286700 from org.apache.spark.storage.BlockManagerInfo]
[Loaded org.apache.spark.storage.BlockManagerInfo$$Lambda$6955/1590083647 from org.apache.spark.storage.BlockManagerInfo]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.BatchScanExec from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.BlockStatus$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceRDD from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.storage.memory.MemoryStore$$Lambda$6956/432206755 from org.apache.spark.storage.memory.MemoryStore]
[Loaded com.couchbase.spark.query.QueryBatch from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded org.apache.spark.sql.connector.read.PartitionReaderFactory from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.couchbase.spark.query.QueryBatch$$Lambda$6957/550239961 from com.couchbase.spark.query.QueryBatch]
[Loaded com.couchbase.spark.query.QueryBatch$$Lambda$6958/1124901406 from com.couchbase.spark.query.QueryBatch]
[Loaded org.apache.spark.sql.connector.read.InputPartition from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.couchbase.spark.query.QueryInputPartition from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$Lambda$6959/1575414690 from org.apache.spark.storage.BlockManagerStorageEndpoint]
[Loaded com.couchbase.spark.query.QueryPartitionReaderFactory from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded org.apache.spark.sql.connector.read.PartitionReader from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$6960/2113868139 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$6961/1758379564 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$6962/1388939574 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded org.apache.spark.storage.BlockManagerStorageEndpoint$$Lambda$6964/1461616850 from org.apache.spark.storage.BlockManagerStorageEndpoint]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$6963/1579647390 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy$$Lambda$6965/1190415856 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy$$Lambda$6966/1491794977 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$6967/975420883 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.RemoveRedundantProjects$$$Lambda$6968/104201841 from org.apache.spark.sql.execution.RemoveRedundantProjects$]
[Loaded org.apache.spark.sql.execution.RemoveRedundantProjects$$$Lambda$6969/1705174949 from org.apache.spark.sql.execution.RemoveRedundantProjects$]
[Loaded org.apache.spark.sql.catalyst.expressions.ExpressionProjection from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.SortOrder$$$Lambda$6970/680731074 from org.apache.spark.sql.catalyst.expressions.SortOrder$]
[Loaded org.apache.spark.sql.execution.TakeOrderedAndProjectExec$$Lambda$6971/717651918 from org.apache.spark.sql.execution.TakeOrderedAndProjectExec]
[Loaded org.apache.spark.sql.execution.CollapseCodegenStages$$Lambda$6972/2081873782 from org.apache.spark.sql.execution.CollapseCodegenStages]
[Loaded org.apache.spark.sql.execution.CollapseCodegenStages$$Lambda$6973/841434847 from org.apache.spark.sql.execution.CollapseCodegenStages]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$$Lambda$6974/1486159860 from org.apache.spark.sql.execution.WholeStageCodegenExec$]
[Loaded org.apache.spark.sql.execution.CollapseCodegenStages$$Lambda$6975/69247026 from org.apache.spark.sql.execution.CollapseCodegenStages]
[Loaded org.apache.spark.sql.execution.CollapseCodegenStages$$Lambda$6976/1209556981 from org.apache.spark.sql.execution.CollapseCodegenStages]
[Loaded org.apache.spark.sql.catalyst.trees.UnaryLike$$Lambda$6977/1929110920 from org.apache.spark.sql.catalyst.trees.UnaryLike]
[Loaded org.apache.spark.sql.internal.connector.SupportsMetadata from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.optimizer.statsEstimation.ProjectEstimation$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.connector.read.SupportsReportStatistics from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.optimizer.statsEstimation.EstimationUtils$$$Lambda$6978/1198895637 from com.databricks.sql.optimizer.statsEstimation.EstimationUtils$]
[Loaded com.databricks.sql.optimizer.statsEstimation.DatabricksBasicStatsPlanVisitor$$$Lambda$6979/856458834 from com.databricks.sql.optimizer.statsEstimation.DatabricksBasicStatsPlanVisitor$]
[Loaded com.databricks.sql.optimizer.statsEstimation.EstimationUtils$$$Lambda$6980/94118182 from com.databricks.sql.optimizer.statsEstimation.EstimationUtils$]
[Loaded scala.collection.TraversableOnce$$Lambda$6981/1322159080 from scala.collection.TraversableOnce]
[Loaded com.databricks.sql.optimizer.statsEstimation.EstimationUtils$$$Lambda$6982/608271678 from com.databricks.sql.optimizer.statsEstimation.EstimationUtils$]
[Loaded com.databricks.sql.optimizer.statsEstimation.EstimationUtils$$$Lambda$6983/1932022258 from com.databricks.sql.optimizer.statsEstimation.EstimationUtils$]
[Loaded org.apache.spark.sql.connector.metric.CustomMetric from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$6984/1399733219 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded org.apache.spark.sql.execution.ui.SparkPlanGraph$$$Lambda$6985/1991627243 from org.apache.spark.sql.execution.ui.SparkPlanGraph$]
[Loaded org.apache.spark.sql.execution.ui.SparkPlanGraphNode$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.ui.SparkPlanGraphClusterWrapper from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.collect.Collector$$$Lambda$6986/2083505248 from org.apache.spark.sql.execution.collect.Collector$]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$ from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$$$Lambda$6987/1959631569 from org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$$$Lambda$6988/1285229621 from org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$$$Lambda$6989/2053681 from org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$$$Lambda$6990/148000428 from org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$$$Lambda$6991/282154613 from org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$]
[Loaded scala.math.Ordering$ExtraImplicits from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Implicits$ from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$IterableOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$OptionOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Tuple3Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Tuple5Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Tuple6Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Tuple7Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Tuple8Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Tuple9Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$UnitOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded scala.math.Ordering$Unit$ from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded java.util.function.ToDoubleFunction from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded org.codehaus.janino.UnitCompiler$20 from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.codehaus.janino.util.iterator.FilterListIterator from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.codehaus.janino.util.iterator.ReverseListIterator from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.GeneratedClass from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$6992/1054909027 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$6993/1038089571 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.InputRDDCodegen$$Lambda$6994/2024744037 from org.apache.spark.sql.execution.InputRDDCodegen]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$6995/1139237545 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$6996/17935187 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$6997/105518209 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.ProjectExec$$anonfun$$nestedInanonfun$usedInputs$1$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$6998/1930050499 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$6999/624088895 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$7000/510638388 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7001/198547775 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7002/698580901 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$$Lambda$7003/446422657 from org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7004/2044621567 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7005/1592066202 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7006/1820918452 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7007/681752178 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.SubExprCodes from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7008/1884189085 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7009/1298290880 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext$$Lambda$7010/1680492695 from org.apache.spark.sql.catalyst.expressions.codegen.CodegenContext]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$7011/34793466 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.ProjectExec$$Lambda$7012/22265020 from org.apache.spark.sql.execution.ProjectExec]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7013/1579700238 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7014/2110313071 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7015/1633822518 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7016/2111222580 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7017/325407879 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.execution.CodegenSupport$$Lambda$7018/679847726 from org.apache.spark.sql.execution.CodegenSupport]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.NewFunctionSpec from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$Lambda$7019/701597267 from org.apache.spark.sql.execution.WholeStageCodegenExec]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$Lambda$7020/753461392 from org.apache.spark.sql.execution.WholeStageCodegenExec]
[Loaded org.apache.spark.sql.execution.BufferedRowIterator from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.CodeFormatter$$$Lambda$7021/2070082050 from org.apache.spark.sql.catalyst.expressions.codegen.CodeFormatter$]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$Lambda$7022/270875030 from org.apache.spark.sql.execution.WholeStageCodegenExec]
[Loaded org.codehaus.janino.ReflectionIClass$ReflectionIField from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.BatchScanExec$$Lambda$7023/512659337 from org.apache.spark.sql.execution.datasources.v2.BatchScanExec]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$7024/427644694 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor88 from __JVM_DefineClass__]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$Lambda$7025/1655318727 from org.apache.spark.sql.execution.WholeStageCodegenExec]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$Lambda$7026/803968262 from org.apache.spark.sql.execution.WholeStageCodegenExec]
[Loaded org.apache.spark.sql.execution.TakeOrderedAndProjectExec$$Lambda$7027/158811064 from org.apache.spark.sql.execution.TakeOrderedAndProjectExec]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7028/605101707 from org.apache.spark.rdd.RDD]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7029/399439582 from org.apache.spark.rdd.RDD]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7030/1685979780 from org.apache.spark.rdd.RDD]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceRDDPartition from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$Lambda$7031/1744706770 from org.apache.spark.sql.execution.datasources.v2.DataSourceRDD]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7032/1852377266 from org.apache.spark.rdd.RDD]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7033/1153978652 from org.apache.spark.rdd.RDD]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor89 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor90 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor91 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor92 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor93 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor94 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor95 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor96 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor97 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor98 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor99 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor100 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor101 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor102 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor103 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor104 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor105 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor106 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor107 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor108 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor109 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor110 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor111 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor112 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor113 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor114 from __JVM_DefineClass__]
[Loaded com.databricks.sql.advice.AdvisorListener$$Lambda$7034/1084305756 from com.databricks.sql.advice.AdvisorListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7035/506438571 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.scheduler.DAGScheduler$$Lambda$7036/693431874 from org.apache.spark.scheduler.DAGScheduler]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7037/795009656 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7038/1405200017 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7039/438519396 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.scheduler.WildcardLocation$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded java.lang.invoke.LambdaForm$DMH/669587785 from java.lang.invoke.LambdaForm]
[Loaded com.databricks.sql.debugger.QueryWatchdogListener$$Lambda$7040/1149264809 from com.databricks.sql.debugger.QueryWatchdogListener]
[Loaded org.apache.spark.scheduler.TaskLocation$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.util.collection.OpenHashMap$mcI$sp from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.scheduler.HostTaskLocation from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.util.collection.OpenHashSet$$Lambda$7041/521143196 from org.apache.spark.util.collection.OpenHashSet]
[Loaded org.apache.spark.util.collection.OpenHashSet$$Lambda$7042/520119932 from org.apache.spark.util.collection.OpenHashSet]
[Loaded org.apache.spark.util.collection.OpenHashSet$$Lambda$7043/1929144544 from org.apache.spark.util.collection.OpenHashSet]
[Loaded org.apache.spark.util.collection.OpenHashSet$$Lambda$7044/1842883915 from org.apache.spark.util.collection.OpenHashSet]
[Loaded sun.reflect.GeneratedMethodAccessor157 from __JVM_DefineClass__]
[Loaded org.apache.spark.util.collection.OpenHashMap$mcI$sp$$Lambda$7045/1925963663 from org.apache.spark.util.collection.OpenHashMap$mcI$sp]
[Loaded java.lang.invoke.LambdaForm$DMH/1852077100 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.util.collection.OpenHashMap$mcI$sp$$Lambda$7046/2135861864 from org.apache.spark.util.collection.OpenHashMap$mcI$sp]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7047/1406169791 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7048/2135685357 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7049/851650108 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.scheduler.ExecutorCacheTaskLocation from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7050/945084390 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.scheduler.HDFSCacheTaskLocation from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.scheduler.TaskSetManager$$Lambda$7051/1532318773 from org.apache.spark.scheduler.TaskSetManager]
[Loaded org.apache.spark.scheduler.TaskSetManager$$Lambda$7052/397382169 from org.apache.spark.scheduler.TaskSetManager]
[Loaded org.apache.spark.sql.execution.ui.StageAndAttemptId from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded scala.collection.mutable.HashMap$$anon$3 from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded org.apache.spark.scheduler.TaskSchedulerImpl$$Lambda$7053/1972241483 from org.apache.spark.scheduler.TaskSchedulerImpl]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor115 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor116 from __JVM_DefineClass__]
[Loaded com.databricks.sql.debugger.QueryWatchdogListener$TaskMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.json.JacksonGenerator from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded java.lang.invoke.LambdaForm$BMH/977072454 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1743339166 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1740351678 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.sql.CouchbaseJsonUtils$$$Lambda$7054/320170088 from org.apache.spark.sql.CouchbaseJsonUtils$]
[Loaded sun.reflect.GeneratedConstructorAccessor87 from __JVM_DefineClass__]
[Loaded org.apache.spark.sql.vectorized.ColumnarBatch from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded java.lang.invoke.LambdaForm$BMH/1548444504 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1387284996 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1137156906 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/423457622 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/510366346 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1314589481 from java.lang.invoke.LambdaForm]
[Loaded java.lang.Class$EnclosingMethodInfo from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded java.lang.invoke.LambdaForm$BMH/821767995 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1110330801 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1276092 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/377956820 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/2146197192 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1335871297 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1414697110 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/2075445825 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1714672097 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/1470609321 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/389106667 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/1250585998 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase$$Lambda$7055/852361703 from org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase]
[Loaded sun.reflect.GeneratedMethodAccessor158 from __JVM_DefineClass__]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$Lambda$7056/1555195709 from org.apache.spark.sql.execution.WholeStageCodegenExec]
[Loaded java.lang.invoke.LambdaForm$MH/514954357 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/834084089 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/410209124 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1463246040 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1161263223 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7057/1282340092 from org.apache.spark.rdd.RDD]
[Loaded org.apache.spark.sql.execution.TakeOrderedAndProjectExec$$Lambda$7058/1313128062 from org.apache.spark.sql.execution.TakeOrderedAndProjectExec]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7059/595131122 from org.apache.spark.rdd.RDD]
[Loaded org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering$$Lambda$7060/1567808042 from org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering]
[Loaded java.lang.invoke.LambdaForm$DMH/1148879474 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/771789800 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/2125080914 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/616985051 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/884940323 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/503989391 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/826609317 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/647686655 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1788392713 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/645577449 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1062447269 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1324837793 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1642639243 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/313982125 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/805468382 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1163769628 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1642974411 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$DMH/1151014863 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1736196874 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1337060782 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1085664007 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1552722353 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1508758841 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/426876727 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1987289805 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1334533267 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/2029526086 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1843768393 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/116068082 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/319923576 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/724795760 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1811216315 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/892790953 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/722641170 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/471698689 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1343813654 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1382625564 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1816985714 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/474541439 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/193425064 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/285036309 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/339186664 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/1339012868 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$DMH/2139972219 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/99753848 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/762471677 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.sql.catalyst.InternalRow$$$Lambda$7061/523731392 from org.apache.spark.sql.catalyst.InternalRow$]
[Loaded org.apache.spark.sql.catalyst.InternalRow$$$Lambda$7062/188589799 from org.apache.spark.sql.catalyst.InternalRow$]
[Loaded org.apache.spark.repl.ExecutorClassLoader$$Lambda$7063/1879145984 from org.apache.spark.repl.ExecutorClassLoader]
[Loaded org.apache.spark.rpc.netty.NettyStreamManager$$Lambda$7064/551020418 from org.apache.spark.rpc.netty.NettyStreamManager]
[Loaded org.apache.spark.network.protocol.StreamFailure from file:/databricks/jars/----workspace_spark_3_2--common--network-common--network-common-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.rpc.netty.NettyRpcEnv$FileDownloadCallback$$Lambda$7065/238406515 from org.apache.spark.rpc.netty.NettyRpcEnv$FileDownloadCallback]
[Loaded org.apache.spark.repl.ExecutorClassLoader$$anon$1$$Lambda$7066/110064648 from org.apache.spark.repl.ExecutorClassLoader$$anon$1]
[Loaded org.apache.spark.repl.ExecutorClassLoader$$Lambda$7067/128720347 from org.apache.spark.repl.ExecutorClassLoader]
[Loaded org.apache.spark.sql.catalyst.expressions.GeneratedClass from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.rdd.RDD$$Lambda$7068/1212688340 from org.apache.spark.rdd.RDD]
[Loaded sun.invoke.util.ValueConversions$1 from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded java.lang.invoke.LambdaForm$DMH/1896692421 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/66198020 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.BoundMethodHandle$Species_LI from __JVM_DefineClass__]
[Loaded java.lang.invoke.LambdaForm$BMH/839115869 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/708852682 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/963716796 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1657132153 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$BMH/1330242928 from java.lang.invoke.LambdaForm]
[Loaded com.couchbase.spark.query.QueryPartitionReader from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7069/415298360 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7070/477174977 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded org.apache.spark.sql.catalyst.json.JSONOptionsInRead$ from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded sun.nio.cs.UTF_32 from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded org.apache.spark.sql.catalyst.StructFilters from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.NoopFilters from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.util.PartialResultException from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$ from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7071/861834616 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7072/1580743128 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7073/1999415819 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7074/215268395 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7075/1260836228 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7076/1414517697 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.StructFilters$ from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.StructFilters$$$Lambda$7077/1960943623 from org.apache.spark.sql.catalyst.StructFilters$]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$JsonPredicate from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7078/1521668626 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7079/1282228671 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7080/1170859186 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7081/1988522073 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7082/988286681 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7083/472825986 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JsonFilters$$Lambda$7084/1401550313 from org.apache.spark.sql.catalyst.json.JsonFilters]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7085/1135429864 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.execution.datasources.v2.MetricsIterator from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.MetricsRowIterator from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.PartitionIterator from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.MetricsHandler from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.deploy.SparkHadoopUtil$$Lambda$7086/1841045458 from org.apache.spark.deploy.SparkHadoopUtil]
[Loaded org.apache.spark.deploy.SparkHadoopUtil$$Lambda$7087/668248895 from org.apache.spark.deploy.SparkHadoopUtil]
[Loaded org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReference from file:/databricks/jars/----workspace_spark_3_2--third_party--hadoop-client-api--org.apache.hadoop__hadoop-client-api__3.3.1_shaded_shaded.jar]
[Loaded scala.Function0$mcJ$sp from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-library_2.12--org.scala-lang__scala-library__2.12.14.jar]
[Loaded org.apache.spark.deploy.SparkHadoopUtil$$anon$2 from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.datasources.v2.DataSourceRDD$$Lambda$7088/888769708 from org.apache.spark.sql.execution.datasources.v2.DataSourceRDD]
[Loaded org.apache.spark.TaskContext$$anon$1 from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.GeneratedClass from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1 from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.codehaus.janino--janino--org.codehaus.janino__janino__3.0.16.jar]
[Loaded org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.TakeOrderedAndProjectExec$$Lambda$7089/522427650 from org.apache.spark.sql.execution.TakeOrderedAndProjectExec]
[Loaded org.apache.spark.util.collection.Utils$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.google.common.collect.Ordering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded org.apache.spark.util.collection.Utils$$anon$1 from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.google.common.collect.ComparatorOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.ExplicitOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.CompoundOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.LexicographicalOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.ByFunctionOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.AllEqualOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.NaturalOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.UsingToStringOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.ReverseOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.NullsFirstOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.google.common.collect.NullsLastOrdering from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--com.google.guava--guava--com.google.guava__guava__15.0.jar]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7090/1624679747 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7091/453351700 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7092/572579477 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7093/240106949 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.N1qlFilters$ from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7094/1223470547 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7095/1002340510 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.client.scala.query.QueryScanConsistency$ConsistentWith from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded com.couchbase.client.scala.query.QueryScanConsistency$RequestPlus from file:/local_disk0/tmp/addedFile8152365715781121503acfaf781_0c8b_46b1_a8e5_dc60cc54663a_spark_connector_assembly_3_2_0_SNAPSHOT_91282-c839d.jar]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7096/501961163 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7097/1641487441 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded com.couchbase.spark.query.QueryPartitionReader$$Lambda$7098/2086559416 from com.couchbase.spark.query.QueryPartitionReader]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7099/771186633 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.sql.catalyst.json.JacksonParser$$Lambda$7100/1792353973 from org.apache.spark.sql.catalyst.json.JacksonParser]
[Loaded org.apache.spark.TaskContextImpl$$Lambda$7101/1025288555 from org.apache.spark.TaskContextImpl]
[Loaded org.apache.spark.sql.connector.metric.CustomTaskMetric from file:/databricks/jars/----workspace_spark_3_2--sql--catalyst--catalyst-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.metric.CustomMetrics$$$Lambda$7102/1896110884 from org.apache.spark.sql.execution.metric.CustomMetrics$]
[Loaded org.apache.spark.scheduler.Task$$Lambda$7103/379161818 from org.apache.spark.scheduler.Task]
[Loaded org.apache.spark.util.CausedBy$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.util.CausedBy$$$Lambda$7104/985296588 from org.apache.spark.util.CausedBy$]
[Loaded org.apache.spark.util.CausedBy$$$Lambda$7105/1416863702 from org.apache.spark.util.CausedBy$]
[Loaded org.apache.spark.executor.Executor$TaskRunner$$Lambda$7106/1825656573 from org.apache.spark.executor.Executor$TaskRunner]
[Loaded org.apache.spark.executor.Executor$TaskRunner$$Lambda$7107/897198084 from org.apache.spark.executor.Executor$TaskRunner]
[Loaded org.apache.spark.executor.Executor$TaskRunner$$Lambda$7108/360315643 from org.apache.spark.executor.Executor$TaskRunner]
[Loaded org.apache.spark.executor.Executor$TaskRunner$$Lambda$7109/1929148825 from org.apache.spark.executor.Executor$TaskRunner]
[Loaded org.apache.spark.executor.Executor$TaskRunner$$Lambda$7110/1884634636 from org.apache.spark.executor.Executor$TaskRunner]
[Loaded org.apache.spark.ThrowableSerializationWrapper from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.ExceptionFailure$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor117 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor118 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor119 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor120 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor121 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor122 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor123 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor124 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor125 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor126 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor127 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor128 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor129 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedSerializationConstructorAccessor130 from __JVM_DefineClass__]
[Loaded org.apache.spark.executor.Executor$TaskRunner$$Lambda$7111/2034319924 from org.apache.spark.executor.Executor$TaskRunner]
[Loaded org.apache.spark.UnknownReason$ from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded java.lang.invoke.LambdaForm$DMH/932682077 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.scheduler.TaskResultGetter$$Lambda$7112/620349470 from org.apache.spark.scheduler.TaskResultGetter]
[Loaded org.apache.spark.scheduler.TaskResultGetter$$Lambda$7113/1628001528 from org.apache.spark.scheduler.TaskResultGetter]
[Loaded org.apache.spark.TaskOutputFileAlreadyExistException from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.scheduler.TaskSetManager$$Lambda$7114/787725458 from org.apache.spark.scheduler.TaskSetManager]
[Loaded org.apache.spark.ExceptionFailure$$Lambda$7115/100041504 from org.apache.spark.ExceptionFailure]
[Loaded org.apache.spark.scheduler.TaskSetManager$$Lambda$7116/529995465 from org.apache.spark.scheduler.TaskSetManager]
[Loaded org.apache.spark.scheduler.OutputCommitCoordinator$$Lambda$7117/509243321 from org.apache.spark.scheduler.OutputCommitCoordinator]
[Loaded org.apache.spark.scheduler.TaskSetManager$$Lambda$7118/2105046988 from org.apache.spark.scheduler.TaskSetManager]
[Loaded org.apache.spark.scheduler.TaskSetFailed from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7119/1294823382 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7120/1217556862 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7121/1664361291 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.LiveStageMetrics$$Lambda$7122/2109813548 from org.apache.spark.sql.execution.ui.LiveStageMetrics]
[Loaded java.lang.invoke.LambdaForm$DMH/488559754 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.sql.execution.ui.LiveStageMetrics$$Lambda$7123/1025909227 from org.apache.spark.sql.execution.ui.LiveStageMetrics]
[Loaded java.lang.invoke.LambdaForm$DMH/2135581567 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$DMH/199653353 from java.lang.invoke.LambdaForm]
[Loaded java.lang.invoke.LambdaForm$MH/1622624411 from java.lang.invoke.LambdaForm]
[Loaded org.apache.spark.scheduler.DAGScheduler$$Lambda$7125/1235947940 from org.apache.spark.scheduler.DAGScheduler]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7124/1243721173 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7126/748297648 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.scheduler.DAGScheduler$$Lambda$7127/354219782 from org.apache.spark.scheduler.DAGScheduler]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7128/175080490 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.scheduler.DAGScheduler$$Lambda$7129/280303026 from org.apache.spark.scheduler.DAGScheduler]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7130/1370299800 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.scheduler.TaskSchedulerImpl$$Lambda$7131/1574930362 from org.apache.spark.scheduler.TaskSchedulerImpl]
[Loaded org.apache.spark.scheduler.TaskSchedulerImpl$$Lambda$7132/1290308500 from org.apache.spark.scheduler.TaskSchedulerImpl]
[Loaded org.apache.spark.scheduler.DAGScheduler$$Lambda$7133/1044312520 from org.apache.spark.scheduler.DAGScheduler]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7134/1813120031 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.scheduler.DAGScheduler$$Lambda$7135/617353680 from org.apache.spark.scheduler.DAGScheduler]
[Loaded org.apache.spark.scheduler.JobFailed from file:/databricks/jars/----workspace_spark_3_2--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.spark.sql.execution.SQLExecution$$$Lambda$7136/1519693039 from org.apache.spark.sql.execution.SQLExecution$]
[Loaded org.apache.spark.sql.execution.SQLExecution$$$Lambda$7137/2107630640 from org.apache.spark.sql.execution.SQLExecution$]
[Loaded org.apache.spark.sql.execution.SQLExecution$$$Lambda$7138/2050841194 from org.apache.spark.sql.execution.SQLExecution$]
[Loaded org.apache.spark.sql.execution.SQLExecution$$$Lambda$7140/1052477036 from org.apache.spark.sql.execution.SQLExecution$]
[Loaded org.apache.spark.sql.execution.SQLExecution$$$Lambda$7141/1094170240 from org.apache.spark.sql.execution.SQLExecution$]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7139/371543421 from org.apache.spark.util.JsonProtocol$]
[Loaded scala.tools.nsc.util.Exceptional$ from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.scala-lang--scala-compiler_2.12--org.scala-lang__scala-compiler__2.12.14.jar]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7142/1781441940 from org.apache.spark.util.JsonProtocol$]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7143/824175639 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7144/471216687 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7145/1624994824 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.status.AppStatusListener$$Lambda$7147/1089454969 from org.apache.spark.status.AppStatusListener]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7148/1827518867 from org.apache.spark.util.JsonProtocol$]
[Loaded org.apache.spark.sql.execution.ui.LiveStageMetrics$$Lambda$7149/2022620594 from org.apache.spark.sql.execution.ui.LiveStageMetrics]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7150/1306582786 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7151/1431576232 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded org.apache.spark.util.JsonProtocol$$$Lambda$7146/636399080 from org.apache.spark.util.JsonProtocol$]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7152/706291313 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7153/1688733809 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7154/573030824 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7155/1621356448 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded org.apache.spark.sql.execution.ui.SQLAppStatusListener$$Lambda$7156/1833811243 from org.apache.spark.sql.execution.ui.SQLAppStatusListener]
[Loaded java.util.concurrent.ConcurrentHashMap$KeySpliterator from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded org.apache.spark.util.kvstore.InMemoryStore$InMemoryView$$Lambda$7157/1912282027 from org.apache.spark.util.kvstore.InMemoryStore$InMemoryView]
[Loaded org.apache.commons.lang3.exception.ExceptionUtils from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.apache.commons--commons-lang3--org.apache.commons__commons-lang3__3.12.0.jar]
[Loaded java.util.stream.Stream$Builder from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded java.util.stream.Streams$AbstractStreamBuilderImpl from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded java.util.stream.Streams$StreamBuilderImpl from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7158/1815781472 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded com.databricks.backend.daemon.driver.ScalaDriverLocal$$Lambda$7159/1342173934 from com.databricks.backend.daemon.driver.ScalaDriverLocal]
[Loaded com.databricks.sql.logging.TaskMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$InputMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$OutputMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$ShuffleReadMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$ShuffleWriteMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded org.apache.commons.lang3.StringUtils from file:/databricks/jars/----workspace_spark_3_2--maven-trees--hive-2.3__hadoop-3.2--org.apache.commons--commons-lang3--org.apache.commons__commons-lang3__3.12.0.jar]
[Loaded com.databricks.sql.logging.QueryProfile$$$Lambda$7160/873161009 from com.databricks.sql.logging.QueryProfile$]
[Loaded com.databricks.sql.logging.TaskMetrics$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$InputMetrics$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$OutputMetrics$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$ShuffleReadMetrics$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded java.lang.Character$Subset from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded java.lang.Character$UnicodeBlock from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded com.databricks.sql.logging.TaskMetrics$ShuffleWriteMetrics$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.QueryProfile$$$Lambda$7161/2075644835 from com.databricks.sql.logging.QueryProfile$]
[Loaded com.databricks.sql.logging.QueryProfile$$$Lambda$7162/520312571 from com.databricks.sql.logging.QueryProfile$]
[Loaded com.databricks.sql.logging.QueryProfile$$$Lambda$7163/1867046345 from com.databricks.sql.logging.QueryProfile$]
[Loaded com.databricks.sql.logging.StageData$TaskSample from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.logging.QueryProfile$$$Lambda$7164/305358573 from com.databricks.sql.logging.QueryProfile$]
[Loaded java.util.regex.Pattern$Block from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded com.databricks.sql.serialization.marshalling.package$$$Lambda$7165/672578263 from com.databricks.sql.serialization.marshalling.package$]
[Loaded com.databricks.sql.serialization.marshalling.package$$$Lambda$7166/1902471153 from com.databricks.sql.serialization.marshalling.package$]
[Loaded com.databricks.sql.serialization.protobuf.ChildExpression$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.SortOrder$NullOrdering from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.SortOrder$NullOrdering$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.SortOrder$Direction from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.backend.daemon.driver.secretredaction.SecretRedactionContextImpl$$Lambda$7167/688477549 from com.databricks.backend.daemon.driver.secretredaction.SecretRedactionContextImpl]
[Loaded com.databricks.sql.serialization.protobuf.SortOrder$Direction$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.SortOrder$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.SortOrder$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.backend.daemon.driver.CommandIdPattern from file:/databricks/jars/spark--driver--common--driver-common-spark_3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.serialization.marshalling.SingleTagSparkPlanMarshaller$$Lambda$7168/615716297 from com.databricks.sql.serialization.marshalling.SingleTagSparkPlanMarshaller]
[Loaded com.databricks.backend.daemon.driver.DatabricksStreamingQueryListenerBase$$Lambda$7169/1629560823 from com.databricks.backend.daemon.driver.DatabricksStreamingQueryListenerBase]
[Loaded com.databricks.backend.daemon.driver.DatabricksStreamingQueryListenerBase$$Lambda$7170/1966422361 from com.databricks.backend.daemon.driver.DatabricksStreamingQueryListenerBase]
[Loaded com.databricks.sql.serialization.protobuf.StageLink$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded java.lang.UnknownError from /usr/lib/jvm/zulu8-ca-amd64/jre/lib/rt.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageLink$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.WholeStageCodegenExec$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.WholeStageCodegenExec$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.logging.StageData$ from file:/databricks/jars/----workspace_spark_3_2--sql--core--core-hive-2.3__hadoop-3.2_2.12_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetricsOrBuilder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageData$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageData$StageStatus from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageData$StageStatus$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleReadMetricsOrBuilder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleWriteMetricsOrBuilder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$InputMetricsOrBuilder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$OutputMetricsOrBuilder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$InputMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$InputMetrics$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$OutputMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$OutputMetrics$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleReadMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleReadMetrics$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleWriteMetrics from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleWriteMetrics$1 from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageData$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageData$TaskSampleOrBuilder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.StageData$TaskSample from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$InputMetrics$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$OutputMetrics$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleReadMetrics$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.serialization.protobuf.TaskMetrics$ShuffleWriteMetrics$Builder from file:/databricks/jars/----workspace_spark_3_2--sql--core--proto_deploy.jar]
[Loaded com.databricks.sql.logging.StageData$$$Lambda$7171/1844234081 from com.databricks.sql.logging.StageData$]
[Loaded com.databricks.sql.logging.StageData$$$Lambda$7172/812001189 from com.databricks.sql.logging.StageData$]
[Loaded com.databricks.sql.logging.StageData$$$Lambda$7173/1683367868 from com.databricks.sql.logging.StageData$]
[Loaded com.databricks.sql.logging.StageData$$$Lambda$7174/1862939544 from com.databricks.sql.logging.StageData$]
[Loaded com.databricks.sql.logging.StageData$$$Lambda$7175/1285776775 from com.databricks.sql.logging.StageData$]
[Loaded com.databricks.sql.logging.StageData$$$Lambda$7176/1308174521 from com.databricks.sql.logging.StageData$]
[Loaded org.apache.spark.storage.StorageStatus$$Lambda$7177/1641773141 from org.apache.spark.storage.StorageStatus]
[Loaded sun.reflect.GeneratedConstructorAccessor88 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedConstructorAccessor89 from __JVM_DefineClass__]
[Loaded org.apache.spark.executor.ExecutorMetricsPoller$$Lambda$7178/875275105 from org.apache.spark.executor.ExecutorMetricsPoller]
[Loaded sun.reflect.GeneratedMethodAccessor159 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedMethodAccessor160 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedMethodAccessor161 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedMethodAccessor162 from __JVM_DefineClass__]
[Loaded sun.reflect.GeneratedMethodAccessor163 from __JVM_DefineClass__]