@avilay
Created April 25, 2021 00:36
reagent-error-3
(py37) ॐ ReAgent git:(master) ✗ $ ./reagent/workflow/cli.py run reagent.workflow.gym_batch_rl.timeline_operator $CONFIG
I0424 173505.209 dataclasses.py:48] USE_VANILLA_DATACLASS: False
I0424 173505.209 dataclasses.py:49] ARBITRARY_TYPES_ALLOWED: True
I0424 173505.210 registry_meta.py:19] Adding REGISTRY to type TrainingReport
I0424 173505.210 registry_meta.py:40] Not Registering TrainingReport to TrainingReport. Abstract method [] are not implemented.
I0424 173505.210 registry_meta.py:19] Adding REGISTRY to type PublishingResult
I0424 173505.210 registry_meta.py:40] Not Registering PublishingResult to PublishingResult. Abstract method [] are not implemented.
I0424 173505.210 registry_meta.py:19] Adding REGISTRY to type ValidationResult
I0424 173505.210 registry_meta.py:40] Not Registering ValidationResult to ValidationResult. Abstract method [] are not implemented.
I0424 173505.211 registry_meta.py:31] Registering NoPublishingResults to PublishingResult
I0424 173505.211 registry_meta.py:34] Using no_publishing_results instead of NoPublishingResults
I0424 173505.211 registry_meta.py:31] Registering NoValidationResults to ValidationResult
I0424 173505.211 registry_meta.py:34] Using no_validation_results instead of NoValidationResults
I0424 173505.237 registry_meta.py:19] Adding REGISTRY to type LearningRateSchedulerConfig
I0424 173505.237 registry_meta.py:40] Not Registering LearningRateSchedulerConfig to LearningRateSchedulerConfig. Abstract method [] are not implemented.
I0424 173505.238 registry_meta.py:19] Adding REGISTRY to type OptimizerConfig
I0424 173505.238 registry_meta.py:40] Not Registering OptimizerConfig to OptimizerConfig. Abstract method [] are not implemented.
I0424 173505.238 registry_meta.py:31] Registering Adam to OptimizerConfig
I0424 173505.240 registry_meta.py:31] Registering SGD to OptimizerConfig
I0424 173505.241 registry_meta.py:31] Registering AdamW to OptimizerConfig
I0424 173505.242 registry_meta.py:31] Registering SparseAdam to OptimizerConfig
I0424 173505.243 registry_meta.py:31] Registering Adamax to OptimizerConfig
I0424 173505.244 registry_meta.py:31] Registering LBFGS to OptimizerConfig
I0424 173505.245 registry_meta.py:31] Registering Rprop to OptimizerConfig
I0424 173505.246 registry_meta.py:31] Registering ASGD to OptimizerConfig
I0424 173505.247 registry_meta.py:31] Registering Adadelta to OptimizerConfig
I0424 173505.248 registry_meta.py:31] Registering Adagrad to OptimizerConfig
I0424 173505.249 registry_meta.py:31] Registering RMSprop to OptimizerConfig
I0424 173505.948 registry_meta.py:19] Adding REGISTRY to type EnvWrapper
I0424 173505.948 registry_meta.py:40] Not Registering EnvWrapper to EnvWrapper. Abstract method ['make', 'serving_obs_preprocessor', 'obs_preprocessor'] are not implemented.
I0424 173505.948 registry_meta.py:31] Registering ChangingArms to EnvWrapper
I0424 173505.956 registry_meta.py:31] Registering Gym to EnvWrapper
I0424 173505.959 utils.py:18] Registering id=Pocman-v0, entry_point=reagent.gym.envs.pomdp.pocman:PocManEnv.
I0424 173505.959 utils.py:18] Registering id=StringGame-v0, entry_point=reagent.gym.envs.pomdp.string_game:StringGameEnv.
I0424 173505.959 utils.py:18] Registering id=LinearDynamics-v0, entry_point=reagent.gym.envs.dynamics.linear_dynamics:LinDynaEnv.
I0424 173505.959 utils.py:18] Registering id=PossibleActionsMaskTester-v0, entry_point=reagent.gym.envs.functionality.possible_actions_mask_tester:PossibleActionsMaskTester.
I0424 173505.959 utils.py:18] Registering id=StringGame-v1, entry_point=reagent.gym.envs.pomdp.string_game_v1:StringGameEnvV1.
I0424 173505.973 registry_meta.py:31] Registering RecSim to EnvWrapper
I0424 173505.975 registry_meta.py:31] Registering OraclePVM to EnvWrapper
I0424 173505.976 registry_meta.py:31] Registering ToyVM to EnvWrapper
I0424 173506.001 registry_meta.py:31] Registering DQNTrainingReport to TrainingReport
I0424 173506.001 registry_meta.py:34] Using dqn_report instead of DQNTrainingReport
I0424 173506.002 registry_meta.py:31] Registering ActorCriticTrainingReport to TrainingReport
I0424 173506.002 registry_meta.py:34] Using actor_critic_report instead of ActorCriticTrainingReport
I0424 173506.002 registry_meta.py:31] Registering WorldModelTrainingReport to TrainingReport
I0424 173506.002 registry_meta.py:34] Using world_model_report instead of WorldModelTrainingReport
I0424 173506.003 registry_meta.py:31] Registering ParametricDQNTrainingReport to TrainingReport
I0424 173506.003 registry_meta.py:34] Using parametric_dqn_report instead of ParametricDQNTrainingReport
I0424 173506.003 registry_meta.py:31] Registering SlateQTrainingReport to TrainingReport
I0424 173506.003 registry_meta.py:34] Using slate_q_report instead of SlateQTrainingReport
I0424 173506.003 registry_meta.py:31] Registering Seq2RewardTrainingReport to TrainingReport
I0424 173506.003 registry_meta.py:34] Using seq2reward_report instead of Seq2RewardTrainingReport
I0424 173506.004 registry_meta.py:19] Adding REGISTRY to type ModelFeatureConfigProvider
I0424 173506.004 registry_meta.py:40] Not Registering ModelFeatureConfigProvider to ModelFeatureConfigProvider. Abstract method ['get_model_feature_config'] are not implemented.
I0424 173506.004 registry_meta.py:31] Registering RawModelFeatureConfigProvider to ModelFeatureConfigProvider
I0424 173506.004 registry_meta.py:34] Using raw instead of RawModelFeatureConfigProvider
I0424 173506.177 registry_meta.py:19] Adding REGISTRY to type ModelPublisher
I0424 173506.177 registry_meta.py:40] Not Registering ModelPublisher to ModelPublisher. Abstract method ['do_publish'] are not implemented.
I0424 173506.179 registry_meta.py:31] Registering FileSystemPublisher to ModelPublisher
I0424 173506.180 registry_meta.py:31] Registering NoPublishing to ModelPublisher
I0424 173506.287 spark_utils.py:55] Building with config:
{'spark.app.name': 'ReAgent',
 'spark.driver.extraClassPath': '/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/preprocessing/target/rl-preprocessing-1.1.jar',
 'spark.driver.host': '127.0.0.1',
 'spark.master': 'local[*]',
 'spark.sql.catalogImplementation': 'hive',
 'spark.sql.execution.arrow.enabled': 'true',
 'spark.sql.session.timeZone': 'UTC',
 'spark.sql.shuffle.partitions': '12',
 'spark.sql.warehouse.dir': '/Users/avilayparekh/projects/cloned/ReAgent/spark-warehouse'}
21/04/24 17:35:07 WARN Utils: Your hostname, Trantor.local resolves to a loopback address: 127.0.0.1; using 192.168.0.130 instead (on interface en0)
21/04/24 17:35:07 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
    at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:355)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:774)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(Unknown Source)
    at java.base/java.lang.String.substring(Unknown Source)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
    ... 25 more
Traceback (most recent call last):
File "./reagent/workflow/cli.py", line 87, in <module>
reagent()
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "./reagent/workflow/cli.py", line 75, in run
func(**config.asdict())
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/reagent/workflow/gym_batch_rl.py", line 72, in timeline_operator
spark = get_spark_session()
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/reagent/data/spark_utils.py", line 60, in get_spark_session
spark = spark.getOrCreate()
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/sql/session.py", line 173, in getOrCreate
sc = SparkContext.getOrCreate(sparkConf)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/context.py", line 367, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/context.py", line 133, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/context.py", line 316, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/java_gateway.py", line 46, in launch_gateway
return _launch_gateway(conf)
File "/Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/java_gateway.py", line 108, in _launch_gateway
raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
> /Users/avilayparekh/opt/anaconda3/envs/py37/lib/python3.7/site-packages/pyspark/java_gateway.py(108)_launch_gateway()
-> raise Exception("Java gateway process exited before sending its port number")
(Pdb)
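
Why this fails: the real error is the "Caused by" block above, which shows Hadoop's org.apache.hadoop.util.Shell class initializer dying with StringIndexOutOfBoundsException: begin 0, end 3, length 2. Hadoop 2.x parses the JVM version at Shell.java:52 with a fixed-width substring. A minimal sketch of what that line does (VersionCheck is a hypothetical name for illustration; the real field lives in org.apache.hadoop.util.Shell):

    // Sketch of the version check behind Shell.java:52 in Hadoop 2.x.
    // VersionCheck is a made-up class name; the real constant lives in
    // org.apache.hadoop.util.Shell.
    public class VersionCheck {
        // Hadoop expects a "1.7.0_xx"-style version string, so the first
        // three characters ("1.7", "1.8") identify the major version.
        public static final boolean IS_JAVA7_OR_ABOVE =
            System.getProperty("java.version")
                .substring(0, 3)          // throws if the string has < 3 chars
                .compareTo("1.7") >= 0;

        public static void main(String[] args) {
            System.out.println(IS_JAVA7_OR_ABOVE);
        }
    }

On JDK 9+ the java.version property is just "9", "11", "14", and so on, so a two-character value like "14" makes substring(0, 3) throw exactly the "begin 0, end 3, length 2" error seen above. The class fails to initialize, the spark-submit JVM exits before reporting its port, and pyspark surfaces that as "Java gateway process exited before sending its port number". The usual workaround is to run the workflow under a Java 8 JVM, for example by pointing JAVA_HOME at a JDK 1.8 install before rerunning the command at the top.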