Created
May 1, 2017 21:42
Gist: bishtgautam/70df5138c4ff76505fcddd9d85d38415
>module list
Currently Loaded Modulefiles:
1) eswrap/1.3.3-1.020200.1278.0 5) udreg/2.3.2-1.0502.10518.2.17.gem 9) gni-headers/4.0-1.0502.10859.7.8.gem 13) rca/1.0.0-2.0502.60530.1.63.gem 17) craype-interlagos 21) modulator/1.2.0
2) craype-network-gemini 6) ugni/6.0-1.0502.10863.8.28.gem 10) xpmem/0.1-2.0502.64982.5.3.gem 14) atp/2.0.5 18) lustredu/1.4 22) hsi/5.0.2.p1
3) pgi/16.10.0 7) pmi/5.0.11 11) dvs/2.5_0.9.0-1.0502.2188.1.113.gem 15) PrgEnv-pgi/5.2.82 19) xalt/0.7.5 23) DefApps
4) craype/2.5.9 8) dmapp/7.0.1-1.0502.11080.8.74.gem 12) alps/5.2.4-2.0502.9774.31.12.gem 16) cray-mpich/7.5.2 20) module_msg/0.1 24) python/2.7.9
>./scripts_regression_tests.py -v -b
Testing commit 0d320ec57666eb139a38be5ab314825be581c23b
Using cime_model = acme
Testing machine = titan
Test root: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334
pylint version 1.5 or newer not found, pylint tests skipped
test_CIMEXML_doctests (__main__.A_RunUnitTests) ... ok
test_CIME_doctests (__main__.A_RunUnitTests) ... ok
test_resolve_variable_name (__main__.A_RunUnitTests) ... ok
test_unittests (__main__.A_RunUnitTests) ... .............
----------------------------------------------------------------------
Ran 13 tests in 0.018s
OK
ok
test_script_is_callable (__main__.G_TestMacrosBasic)
The test script can be called on valid output without dying. ... ok
test_script_rejects_bad_build_system (__main__.G_TestMacrosBasic)
The macro writer rejects a bad build system string. ... ok
test_script_rejects_bad_xml (__main__.G_TestMacrosBasic)
The macro writer rejects input that's not valid XML. ... ok
test_append_flags (__main__.H_TestMakeMacros)
Test appending flags to a list. ... ok
test_append_flags_without_base (__main__.H_TestMakeMacros)
Test appending flags to a value set before Macros is included. ... ok
test_base_flags (__main__.H_TestMakeMacros)
Test that we get "base" compiler flags. ... ok
test_build_time_append_flags (__main__.H_TestMakeMacros)
Test build_time selection of compiler flags. ... ok
test_build_time_attribute (__main__.H_TestMakeMacros)
The macro writer writes conditionals for build-time choices. ... ok
test_build_time_base_flags (__main__.H_TestMakeMacros)
Test selection of base flags based on build-time attributes. ... ok
test_build_time_base_flags_same_parent (__main__.H_TestMakeMacros)
Test selection of base flags in the same parent element. ... ok
test_compiler_changeable_at_build_time (__main__.H_TestMakeMacros)
The macro writer writes information for multiple compilers. ... ok
test_config_reject_cyclical_references (__main__.H_TestMakeMacros)
Test that cyclical <var> references are rejected. ... ok
test_config_reject_self_references (__main__.H_TestMakeMacros)
Test that <var> self-references are rejected. ... ok
test_config_variable_insertion (__main__.H_TestMakeMacros)
Test that <var> elements insert variables from config_build. ... ok
test_env_and_shell_command (__main__.H_TestMakeMacros)
Test that <env> elements work inside <shell> elements. ... ok
test_environment_variable_insertion (__main__.H_TestMakeMacros)
Test that <env> elements insert environment variables. ... ok
test_generic_item (__main__.H_TestMakeMacros)
The macro writer can write out a single generic item. ... ok
test_ignore_non_match (__main__.H_TestMakeMacros)
The macro writer ignores an entry with the wrong machine name. ... ok
test_mach_and_os_beats_mach (__main__.H_TestMakeMacros)
The macro writer chooses the most-specific match possible. ... ok
test_mach_beats_os (__main__.H_TestMakeMacros)
The macro writer chooses machine-specific over os-specific matches. ... ok
test_machine_specific_append_flags (__main__.H_TestMakeMacros)
Test appending flags that are either more or less machine-specific. ... ok
test_machine_specific_base_and_append_flags (__main__.H_TestMakeMacros)
Test that machine-specific base flags coexist with machine-specific append flags. ... ok
test_machine_specific_base_flags (__main__.H_TestMakeMacros)
Test selection among base compiler flag sets based on machine. ... ok
test_machine_specific_base_over_append_flags (__main__.H_TestMakeMacros)
Test that machine-specific base flags override default append flags. ... ok
test_machine_specific_item (__main__.H_TestMakeMacros)
The macro writer can pick out a machine-specific item. ... ok
test_multiple_shell_commands (__main__.H_TestMakeMacros)
Test that more than one <shell> element can be used. ... ok
test_os_specific_item (__main__.H_TestMakeMacros)
The macro writer can pick out an OS-specific item. ... ok
test_reject_ambiguous (__main__.H_TestMakeMacros)
The macro writer dies if given an ambiguous set of matches. ... ok
test_reject_duplicate_defaults (__main__.H_TestMakeMacros)
The macro writer dies if given many defaults. ... ok
test_reject_duplicates (__main__.H_TestMakeMacros)
The macro writer dies if given many matches for a given configuration. ... ok
test_shell_command_insertion (__main__.H_TestMakeMacros)
Test that <shell> elements insert shell command output. ... ok
test_variable_insertion_with_machine_specific_setting (__main__.H_TestMakeMacros)
Test that machine-specific <var> dependencies are correct. ... ok
test_append_flags (__main__.I_TestCMakeMacros)
Test appending flags to a list. ... ok
test_append_flags_without_base (__main__.I_TestCMakeMacros)
Test appending flags to a value set before Macros is included. ... ok
test_base_flags (__main__.I_TestCMakeMacros)
Test that we get "base" compiler flags. ... ok
test_build_time_append_flags (__main__.I_TestCMakeMacros)
Test build_time selection of compiler flags. ... ok
test_build_time_attribute (__main__.I_TestCMakeMacros)
The macro writer writes conditionals for build-time choices. ... ok
test_build_time_base_flags (__main__.I_TestCMakeMacros)
Test selection of base flags based on build-time attributes. ... ok
test_build_time_base_flags_same_parent (__main__.I_TestCMakeMacros)
Test selection of base flags in the same parent element. ... ok
test_compiler_changeable_at_build_time (__main__.I_TestCMakeMacros)
The macro writer writes information for multiple compilers. ... ok
test_config_reject_cyclical_references (__main__.I_TestCMakeMacros)
Test that cyclical <var> references are rejected. ... ok
test_config_reject_self_references (__main__.I_TestCMakeMacros)
Test that <var> self-references are rejected. ... ok
test_config_variable_insertion (__main__.I_TestCMakeMacros)
Test that <var> elements insert variables from config_build. ... ok
test_env_and_shell_command (__main__.I_TestCMakeMacros)
Test that <env> elements work inside <shell> elements. ... FAIL
test_environment_variable_insertion (__main__.I_TestCMakeMacros)
Test that <env> elements insert environment variables. ... ok
test_generic_item (__main__.I_TestCMakeMacros)
The macro writer can write out a single generic item. ... ok
test_ignore_non_match (__main__.I_TestCMakeMacros)
The macro writer ignores an entry with the wrong machine name. ... ok
test_mach_and_os_beats_mach (__main__.I_TestCMakeMacros)
The macro writer chooses the most-specific match possible. ... ok
test_mach_beats_os (__main__.I_TestCMakeMacros)
The macro writer chooses machine-specific over os-specific matches. ... ok
test_machine_specific_append_flags (__main__.I_TestCMakeMacros)
Test appending flags that are either more or less machine-specific. ... ok
test_machine_specific_base_and_append_flags (__main__.I_TestCMakeMacros)
Test that machine-specific base flags coexist with machine-specific append flags. ... ok
test_machine_specific_base_flags (__main__.I_TestCMakeMacros)
Test selection among base compiler flag sets based on machine. ... ok
test_machine_specific_base_over_append_flags (__main__.I_TestCMakeMacros)
Test that machine-specific base flags override default append flags. ... ok
test_machine_specific_item (__main__.I_TestCMakeMacros)
The macro writer can pick out a machine-specific item. ... ok
test_multiple_shell_commands (__main__.I_TestCMakeMacros)
Test that more than one <shell> element can be used. ... FAIL
test_os_specific_item (__main__.I_TestCMakeMacros)
The macro writer can pick out an OS-specific item. ... ok
test_reject_ambiguous (__main__.I_TestCMakeMacros)
The macro writer dies if given an ambiguous set of matches. ... ok
test_reject_duplicate_defaults (__main__.I_TestCMakeMacros)
The macro writer dies if given many defaults. ... ok
test_reject_duplicates (__main__.I_TestCMakeMacros)
The macro writer dies if given many matches for a given configuration. ... ok
test_shell_command_insertion (__main__.I_TestCMakeMacros)
Test that <shell> elements insert shell command output. ... FAIL
test_variable_insertion_with_machine_specific_setting (__main__.I_TestCMakeMacros)
Test that machine-specific <var> dependencies are correct. ... ok
test_a_createnewcase (__main__.J_TestCreateNewcase) ... FAIL
test_b_user_mods (__main__.J_TestCreateNewcase) ... ok
test_c_create_clone_keepexe (__main__.J_TestCreateNewcase) ... FAIL
test_d_create_clone_new_user (__main__.J_TestCreateNewcase) ... FAIL
test_e_xmlquery (__main__.J_TestCreateNewcase) ... FAIL
test_cime_case (__main__.K_TestCimeCase) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.submit.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
test_cime_case_force_pecount (__main__.K_TestCimeCase) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133511
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133511
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.submit.fake_testing_only_20170501_133511
test_cime_case_mpi_serial (__main__.K_TestCimeCase) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133523
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.submit.fake_testing_only_20170501_133523
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133523
test_save_timings (__main__.L_TestSaveTimings) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133531
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133531
test_save_timings_manual (__main__.L_TestSaveTimings) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133539
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133539
test_wait_for_test_all_pass (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_cdash_kill (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_cdash_pass (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_no_wait (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_wait_for_missing_run_phase (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_wait_for_pend (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_wait_kill (__main__.M_TestWaitForTests) ... ok
test_wait_for_test_with_fail (__main__.M_TestWaitForTests) ... ok
test_a_unit_test (__main__.N_TestUnitTest) ... skipped 'Skipping TestUnitTest - only supported on yellowstone with intel'
test_b_cime_f90_unit_tests (__main__.N_TestUnitTest) ... skipped 'Skipping TestUnitTest - only supported on yellowstone with intel'
test_a_phases (__main__.O_TestTestScheduler) ... ok
test_b_full (__main__.O_TestTestScheduler) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133653-20170501_133654
test_c_use_existing (__main__.O_TestTestScheduler) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133723-20170501_133723
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.submit.fake_testing_only_20170501_133723-20170501_133723
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133723-20170501_133723
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_133723-20170501_133723
test_jenkins_generic_job (__main__.P_TestJenkinsGenericJob) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732
test_jenkins_generic_job_kill (__main__.P_TestJenkinsGenericJob) ... ok
test_bless_test_results (__main__.Q_TestBlessTestResults) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134028-20170501_134028
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_134028-20170501_134028
test_rebless_namelist (__main__.Q_TestBlessTestResults) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.submit.fake_testing_only_20170501_134042-20170501_134042
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_134042-20170501_134042
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042
test_update_acme_tests (__main__.R_TestUpdateACMETests) ... skipped 'Disabling this test until we figure out how to integrate ACME tests and CIME xml files.'
test_update_acme_tests_test_mods (__main__.R_TestUpdateACMETests) ... skipped 'Disabling this test until we figure out how to integrate ACME tests and CIME xml files.'
test_query_testlists_count_runs (__main__.S_TestManageAndQuery)
Make sure that query_testlists runs successfully with the --count argument ... ok
test_query_testlists_list_runs (__main__.S_TestManageAndQuery)
Make sure that query_testlists runs successfully with the --list argument ... ok
test_query_testlists_runs (__main__.S_TestManageAndQuery)
Make sure that query_testlists runs successfully ... ok
test_single_submit (__main__.X_TestSingleSubmit) ... skipped 'Skipping single submit. Only works on skybridge'
test_full_system (__main__.Z_FullSystemTest) ... FAIL
Stdout:
Detected failed test or user request no teardown
Leaving files:
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_D_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERP.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS_N2.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS.T42_T42.S.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/NCK_Ld3.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/DAE.f19_f19.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SEQ_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS.ne30_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/cs.status.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERIO.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERI.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054
/ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054
======================================================================
FAIL: test_env_and_shell_command (__main__.I_TestCMakeMacros)
Test that <env> elements work inside <shell> elements.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./scripts_regression_tests.py", line 1890, in test_env_and_shell_command
    tester.assert_variable_equals("FFLAGS", "-O2 -fast", env={"OPT_LEVEL": "2"})
  File "./scripts_regression_tests.py", line 1599, in assert_variable_equals
    self.parent.assertEqual(self.query_var(var_name, env, var), value)
  File "./scripts_regression_tests.py", line 1580, in query_var
    run_cmd_assert_result(self.parent, "cmake %s . 2>&1" % cmake_args, from_dir=temp_dir, env=environment)
  File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
    test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: cmake -DCMAKE_SYSTEM_NAME=Catamount . 2>&1
FROM_DIR: /tmp/tmpxL9JlZ
SHOULD HAVE WORKED, INSTEAD GOT STAT 1
OUTPUT: -- The C compiler identification is GNU
-- The CXX compiler identification is GNU
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
CMake Error at Macros.cmake:3 (unset):
  Unknown CMake command "unset".
Call Stack (most recent call first):
  CMakeLists.txt:2 (include)
CMake Warning (dev) in CMakeLists.txt:
  No cmake_minimum_required command is present. A line of code such as
    cmake_minimum_required(VERSION 2.6)
  should be added at the top of the file. The version specified may be lower
  if you wish to support older CMake versions for this project. For more
  information run "cmake --help-policy CMP0000".
This warning is for project developers. Use -Wno-dev to suppress it.
-- Configuring incomplete, errors occurred!
ERRPUT:
======================================================================
FAIL: test_multiple_shell_commands (__main__.I_TestCMakeMacros)
Test that more than one <shell> element can be used.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./scripts_regression_tests.py", line 1881, in test_multiple_shell_commands
    tester.assert_variable_equals("FFLAGS", "-O2 -fast")
  File "./scripts_regression_tests.py", line 1599, in assert_variable_equals
    self.parent.assertEqual(self.query_var(var_name, env, var), value)
  File "./scripts_regression_tests.py", line 1580, in query_var
    run_cmd_assert_result(self.parent, "cmake %s . 2>&1" % cmake_args, from_dir=temp_dir, env=environment)
  File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
    test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: cmake -DCMAKE_SYSTEM_NAME=Catamount . 2>&1
FROM_DIR: /tmp/tmpCrqz7i
SHOULD HAVE WORKED, INSTEAD GOT STAT 1
OUTPUT: -- The C compiler identification is GNU
-- The CXX compiler identification is GNU
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
CMake Error at Macros.cmake:4 (unset):
  Unknown CMake command "unset".
Call Stack (most recent call first):
  CMakeLists.txt:2 (include)
CMake Warning (dev) in CMakeLists.txt:
  No cmake_minimum_required command is present. A line of code such as
    cmake_minimum_required(VERSION 2.6)
  should be added at the top of the file. The version specified may be lower
  if you wish to support older CMake versions for this project. For more
  information run "cmake --help-policy CMP0000".
This warning is for project developers. Use -Wno-dev to suppress it.
-- Configuring incomplete, errors occurred!
ERRPUT:
======================================================================
FAIL: test_shell_command_insertion (__main__.I_TestCMakeMacros)
Test that <shell> elements insert shell command output.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./scripts_regression_tests.py", line 1872, in test_shell_command_insertion
    tester.assert_variable_equals("FFLAGS", "-O2 -fast")
  File "./scripts_regression_tests.py", line 1599, in assert_variable_equals
    self.parent.assertEqual(self.query_var(var_name, env, var), value)
  File "./scripts_regression_tests.py", line 1580, in query_var
    run_cmd_assert_result(self.parent, "cmake %s . 2>&1" % cmake_args, from_dir=temp_dir, env=environment)
  File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
    test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: cmake -DCMAKE_SYSTEM_NAME=Catamount . 2>&1
FROM_DIR: /tmp/tmpGfGVHN
SHOULD HAVE WORKED, INSTEAD GOT STAT 1
OUTPUT: -- The C compiler identification is GNU
-- The CXX compiler identification is GNU
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
CMake Error at Macros.cmake:3 (unset):
  Unknown CMake command "unset".
Call Stack (most recent call first):
  CMakeLists.txt:2 (include)
CMake Warning (dev) in CMakeLists.txt:
  No cmake_minimum_required command is present. A line of code such as
    cmake_minimum_required(VERSION 2.6)
  should be added at the top of the file. The version specified may be lower
  if you wish to support older CMake versions for this project. For more
  information run "cmake --help-policy CMP0000".
This warning is for project developers. Use -Wno-dev to suppress it.
-- Configuring incomplete, errors occurred!
ERRPUT:
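[Annotation] The three I_TestCMakeMacros failures above share one root cause: the generated Macros.cmake calls unset(), which the system cmake on this node does not recognize. unset() was added in CMake 2.6.3, so the /usr/bin cmake here is presumably older. A minimal sketch of a pre-flight check (a hypothetical helper, not part of scripts_regression_tests.py) that parses the first line of `cmake --version`:

```python
import re

def cmake_supports_unset(version_output):
    """Return True if the reported CMake version is >= 2.6.3,
    the release that introduced the unset() command whose absence
    produces 'Unknown CMake command "unset"' as in the failures above."""
    m = re.search(r"cmake version (\d+)\.(\d+)\.(\d+)", version_output)
    if m is None:
        return False  # unparseable output: assume too old
    return tuple(int(g) for g in m.groups()) >= (2, 6, 3)

print(cmake_supports_unset("cmake version 2.6.2"))  # False: predates unset()
print(cmake_supports_unset("cmake version 3.5.1"))  # True
```

Loading a newer cmake module before running the CMake macro tests would likely avoid all three failures.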
======================================================================
FAIL: test_a_createnewcase (__main__.J_TestCreateNewcase)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "./scripts_regression_tests.py", line 253, in test_a_createnewcase
    run_cmd_assert_result(self, "./case.setup", from_dir=testdir)
  File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
    test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: ./case.setup
FROM_DIR: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/testcreatenewcase
SHOULD HAVE WORKED, INSTEAD GOT STAT 1
OUTPUT:
ERRPUT: ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message:
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1'
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts
FAIL: test_c_create_clone_keepexe (__main__.J_TestCreateNewcase) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 288, in test_c_create_clone_keepexe | |
(SCRIPT_DIR, prevtestdir, testdir),from_dir=SCRIPT_DIR) | |
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result | |
test_obj.assertEqual(stat, expected_stat, msg=msg) | |
AssertionError: | |
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_clone --clone /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/testcreatenewcase --case /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/test_create_clone_keepexe --keepexe | |
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts | |
SHOULD HAVE WORKED, INSTEAD GOT STAT 1 | |
OUTPUT: Creating Case directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/test_create_clone_keepexe | |
Successfully created new case test_create_clone_keepexe from clone case testcreatenewcase | |
ERRPUT: ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
====================================================================== | |
FAIL: test_d_create_clone_new_user (__main__.J_TestCreateNewcase) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 312, in test_d_create_clone_new_user | |
(SCRIPT_DIR, prevtestdir, testdir, cls._testroot),from_dir=SCRIPT_DIR) | |
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result | |
test_obj.assertEqual(stat, expected_stat, msg=msg) | |
AssertionError: | |
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_clone --clone /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/testcreatenewcase --case /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/test_create_clone_new_user --cime-output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase | |
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts | |
SHOULD HAVE WORKED, INSTEAD GOT STAT 1 | |
OUTPUT: Creating Case directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TestCreateNewcase/test_create_clone_new_user | |
Successfully created new case test_create_clone_new_user from clone case testcreatenewcase | |
ERRPUT: ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
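The tracebacks above (and several below) all fail inside `run_cmd_assert_result` at line 53 of `scripts_regression_tests.py`. A minimal sketch of what such a helper might look like, inferred only from the COMMAND / FROM_DIR / "SHOULD HAVE WORKED" text in this log — the real CIME implementation may differ:

```python
# Hypothetical reconstruction of run_cmd_assert_result(), inferred from the
# failure messages in this log; not the actual CIME source.
import subprocess

def run_cmd_assert_result(test_obj, cmd, from_dir=".", expected_stat=0):
    # Run the command from the requested directory, capturing both streams.
    proc = subprocess.Popen(cmd, shell=True, cwd=from_dir,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output, errput = proc.communicate()
    stat = proc.returncode
    # Build a failure message mirroring the log format seen above.
    msg = ("\nCOMMAND: %s\nFROM_DIR: %s\n"
           "SHOULD HAVE WORKED, INSTEAD GOT STAT %d\nOUTPUT: %s\nERRPUT: %s"
           % (cmd, from_dir, stat, output.decode(), errput.decode()))
    test_obj.assertEqual(stat, expected_stat, msg=msg)
    return output.decode()
```

A status mismatch surfaces as the `AssertionError:` blocks in this log, with the full command, working directory, and captured output embedded in the message.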
======================================================================
FAIL: test_e_xmlquery (__main__.J_TestCreateNewcase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "./scripts_regression_tests.py", line 337, in test_e_xmlquery
self.assertTrue(output == "TRUE", msg="%s != %s"%(output, BUILD_COMPLETE))
AssertionError: FALSE != False
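The odd-looking `FALSE != False` above comes from the assertion on line 337: the test compares the command's string output against `"TRUE"`, but formats the failure message with the Python value `BUILD_COMPLETE` (apparently the boolean `False` here). A small self-contained illustration of how such a message arises — the variable values are hypothetical stand-ins, not taken from the actual run:

```python
# Illustration of the "FALSE != False" message above; "output" stands in
# for the string result of an xmlquery call and BUILD_COMPLETE for a
# Python boolean (assumed values, for demonstration only).
output = "FALSE"        # string printed by the queried command
BUILD_COMPLETE = False  # boolean, not the expected string "TRUE"

# The comparison is against the string "TRUE", but the message mixes a
# string with a boolean's repr, producing the confusing text.
msg = "%s != %s" % (output, BUILD_COMPLETE)
print(msg)
```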
======================================================================
FAIL: test_cime_case (__main__.K_TestCimeCase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "./scripts_regression_tests.py", line 1208, in test_cime_case
% (SCRIPT_DIR, self._baseline_name, TEST_ROOT, TEST_ROOT))
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test cime_test_only -t fake_testing_only_20170501_133450 --no-build --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests
SHOULD HAVE WORKED, INSTEAD GOT STAT 100
OUTPUT: Using project from .cesm_proj: cli112
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
RUNNING TESTS:
TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi
TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi
TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi
TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi
TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi
TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi
TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi
TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi
Starting CREATE_NEWCASE for test TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting CREATE_NEWCASE for test TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi in 4.201761 seconds (PASS)
Starting XML for test TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi in 4.674630 seconds (PASS)
Starting XML for test TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi in 4.853255 seconds (PASS)
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 4.858426 seconds (PASS)
Finished CREATE_NEWCASE for test TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi in 4.917109 seconds (PASS)
Finished CREATE_NEWCASE for test TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi in 4.911940 seconds (PASS)
Finished CREATE_NEWCASE for test TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi in 4.937361 seconds (PASS)
Finished CREATE_NEWCASE for test TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi in 4.944592 seconds (PASS)
Starting XML for test TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi with 1 procs
Starting XML for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting XML for test TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting XML for test TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting XML for test TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting XML for test TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi in 1.775123 seconds (PASS)
Starting SETUP for test TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi with 1 procs
Finished XML for test TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi in 3.157516 seconds (PASS)
Starting SETUP for test TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi in 3.600726 seconds (PASS)
Starting SETUP for test TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi in 3.643111 seconds (PASS)
Finished XML for test TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi in 3.724923 seconds (PASS)
Finished XML for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 3.784150 seconds (PASS)
Finished XML for test TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi in 3.860343 seconds (PASS)
Starting SETUP for test TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting SETUP for test TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi with 1 procs
Starting SETUP for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Starting SETUP for test TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi in 3.873571 seconds (PASS)
Starting SETUP for test TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished SETUP for test TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi in 3.375011 seconds (FAIL). [COMPLETED 1 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi in 7.257930 seconds (FAIL). [COMPLETED 2 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi in 9.892395 seconds (FAIL). [COMPLETED 3 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi in 11.423053 seconds (FAIL). [COMPLETED 4 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi in 11.438194 seconds (FAIL). [COMPLETED 5 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi in 11.451198 seconds (FAIL). [COMPLETED 6 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 11.440723 seconds (FAIL). [COMPLETED 7 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
Finished SETUP for test TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi in 11.359709 seconds (FAIL). [COMPLETED 8 of 8] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
At test-scheduler close, state is:
FAIL TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKFAIL_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTTESTDIFF_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAIL_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTMEMLEAKPASS_P1.f19_g16.X.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
FAIL TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTBUILDFAILEXC_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133450
test-scheduler took 20.5290908813 seconds
ERRPUT:
======================================================================
FAIL: test_cime_case_force_pecount (__main__.K_TestCimeCase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "./scripts_regression_tests.py", line 1265, in test_cime_case_force_pecount
% (SCRIPT_DIR, self._baseline_name, self._testroot, self._testroot))
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A -t fake_testing_only_20170501_133511 --no-build --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --force-procs 16 --force-threads 8
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests
SHOULD HAVE WORKED, INSTEAD GOT STAT 100
OUTPUT: Using project from .cesm_proj: cli112
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133511
RUNNING TESTS:
TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi
Starting CREATE_NEWCASE for test TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi in 7.304669 seconds (PASS)
Starting XML for test TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi in 0.495853 seconds (PASS)
Starting SETUP for test TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi with 1 procs
Finished SETUP for test TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi in 3.608866 seconds (FAIL). [COMPLETED 1 of 1] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133511
At test-scheduler close, state is:
FAIL TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial_P16x8.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133511
test-scheduler took 11.8325459957 seconds
ERRPUT:
======================================================================
FAIL: test_cime_case_mpi_serial (__main__.K_TestCimeCase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "./scripts_regression_tests.py", line 1247, in test_cime_case_mpi_serial
% (SCRIPT_DIR, self._baseline_name, self._testroot, self._testroot))
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A -t fake_testing_only_20170501_133523 --no-build --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests
SHOULD HAVE WORKED, INSTEAD GOT STAT 100
OUTPUT: Using project from .cesm_proj: cli112
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133523
RUNNING TESTS:
TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi
Starting CREATE_NEWCASE for test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi in 3.047496 seconds (PASS)
Starting XML for test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi in 0.549103 seconds (PASS)
Starting SETUP for test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi with 1 procs
Finished SETUP for test TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi in 3.651443 seconds (FAIL). [COMPLETED 1 of 1] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133523
At test-scheduler close, state is:
FAIL TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_Mmpi-serial.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133523
test-scheduler took 7.62228798866 seconds
ERRPUT:
======================================================================
FAIL: test_save_timings (__main__.L_TestSaveTimings)
----------------------------------------------------------------------
Traceback (most recent call last):
File "./scripts_regression_tests.py", line 1340, in test_save_timings
self.simple_test()
File "./scripts_regression_tests.py", line 1314, in simple_test
run_cmd_assert_result(self, create_test_cmd)
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test SMS_Ln9_P1.f19_g16_rx1.A --save-timing --walltime 0:15:00 -t fake_testing_only_20170501_133531 --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests
SHOULD HAVE WORKED, INSTEAD GOT STAT 100
OUTPUT: Using project from .cesm_proj: cli112
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133531
RUNNING TESTS:
SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi
Starting CREATE_NEWCASE for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi in 2.940454 seconds (PASS)
Starting XML for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi in 0.492691 seconds (PASS)
Starting SETUP for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished SETUP for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi in 3.435683 seconds (FAIL). [COMPLETED 1 of 1] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133531
Due to presence of batch system, create_test will exit before tests are complete.
To force create_test to wait for full completion, use --wait
At test-scheduler close, state is:
FAIL SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133531
test-scheduler took 7.21913385391 seconds
ERRPUT:
======================================================================
FAIL: test_save_timings_manual (__main__.L_TestSaveTimings)
----------------------------------------------------------------------
Traceback (most recent call last):
File "./scripts_regression_tests.py", line 1345, in test_save_timings_manual
self.simple_test(manual_timing=True)
File "./scripts_regression_tests.py", line 1314, in simple_test
run_cmd_assert_result(self, create_test_cmd)
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result
test_obj.assertEqual(stat, expected_stat, msg=msg)
AssertionError:
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test SMS_Ln9_P1.f19_g16_rx1.A --walltime 0:15:00 -t fake_testing_only_20170501_133539 --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests
SHOULD HAVE WORKED, INSTEAD GOT STAT 100
OUTPUT: Using project from .cesm_proj: cli112
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133539
RUNNING TESTS:
SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi
Starting CREATE_NEWCASE for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished CREATE_NEWCASE for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi in 2.907993 seconds (PASS)
Starting XML for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished XML for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi in 0.502290 seconds (PASS)
Starting SETUP for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi with 1 procs
Finished SETUP for test SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi in 3.420354 seconds (FAIL). [COMPLETED 1 of 1] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133539
Due to presence of batch system, create_test will exit before tests are complete.
To force create_test to wait for full completion, use --wait
At test-scheduler close, state is:
FAIL SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi (phase SETUP)
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_Ln9_P1.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_133539
test-scheduler took 7.21601295471 seconds
ERRPUT:
====================================================================== | |
FAIL: test_b_full (__main__.O_TestTestScheduler) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 764, in test_b_full | |
self.assertEqual(ts.get_status(CIME.test_scheduler.RUN_PHASE), TEST_PASS_STATUS, msg=test_name) | |
AssertionError: TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi | |
====================================================================== | |
FAIL: test_c_use_existing (__main__.O_TestTestScheduler) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 793, in test_c_use_existing | |
self.assertEqual(ts.get_status(CIME.test_scheduler.MODEL_BUILD_PHASE), TEST_FAIL_STATUS) | |
AssertionError: None != 'FAIL' | |
====================================================================== | |
FAIL: test_jenkins_generic_job (__main__.P_TestJenkinsGenericJob) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 887, in test_jenkins_generic_job | |
self.simple_test(True, "-t cime_test_only_pass -g -b %s" % self._baseline_name) | |
File "./scripts_regression_tests.py", line 849, in simple_test | |
from_dir=self._testdir, expected_stat=(0 if expect_works else CIME.utils.TESTS_FAILED_ERR_CODE)) | |
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result | |
test_obj.assertEqual(stat, expected_stat, msg=msg) | |
AssertionError: | |
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/jenkins_generic_job -r /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732 -t cime_test_only_pass -g -b fake_testing_only_20170501_133732 | |
FROM_DIR: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732 | |
SHOULD HAVE WORKED, INSTEAD GOT STAT 100 | |
OUTPUT: RUN: ./create_test cime_test_only_pass --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins -t jenkins_fake_testing_only_20170501_133732_20170501_133732 -g -b fake_testing_only_20170501_133732 | |
Using project from .cesm_proj: cli112 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
RUNNING TESTS: | |
TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi | |
TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi | |
TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi | |
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi in 4.234068 seconds (PASS) | |
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi in 4.254059 seconds (PASS) | |
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 4.257855 seconds (PASS) | |
Starting XML for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting XML for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting XML for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished XML for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 1.966238 seconds (PASS) | |
Starting SETUP for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished XML for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi in 2.076888 seconds (PASS) | |
Finished XML for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi in 2.137705 seconds (PASS) | |
Starting SETUP for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting SETUP for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Finished SETUP for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 31.504397 seconds (FAIL). [COMPLETED 1 of 3] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
Finished SETUP for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi in 31.466694 seconds (FAIL). [COMPLETED 2 of 3] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
Finished SETUP for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi in 31.484858 seconds (FAIL). [COMPLETED 3 of 3] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
Due to presence of batch system, create_test will exit before tests are complete. | |
To force create_test to wait for full completion, use --wait | |
At test-scheduler close, state is: | |
FAIL TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
FAIL TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
FAIL TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732 | |
test-scheduler took 38.2906999588 seconds | |
stat: 100 | |
Test 'TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi' finished with status 'FAIL' | |
Path: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732/TestStatus | |
Test 'TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi' finished with status 'FAIL' | |
Path: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732/TestStatus | |
Test 'TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi' finished with status 'FAIL' | |
Path: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/jenkins_generic_test_subdir_fake_testing_only_20170501_133732/jenkins/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.jenkins_fake_testing_only_20170501_133732_20170501_133732/TestStatus | |
ERRPUT: Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
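Every traceback in this run bottoms out in the same place: CIME's `expect` helper at `scripts/lib/CIME/utils.py`, line 29. A minimal sketch of what that helper does, reconstructed from the single `raise` statement quoted in the tracebacks (the real function takes additional arguments and logging behavior; the defaults here are assumptions):

```python
def expect(condition, error_msg, exc_type=SystemExit, error_prefix="ERROR:"):
    """Sketch of CIME.utils.expect: raise exc_type with a prefixed
    message when condition is false, otherwise do nothing."""
    if not condition:
        # This is the exact line shown in the tracebacks above.
        raise exc_type("%s %s" % (error_prefix, error_msg))
```

This is why `_setup_phase` aborts: `case.setup` returns a nonzero status because of the module conflict, `cmdstat in [0, TESTS_FAILED_ERR_CODE]` is false, and `expect` raises.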
====================================================================== | |
FAIL: test_bless_test_results (__main__.Q_TestBlessTestResults) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 958, in test_bless_test_results | |
self.simple_test(True, "%s -t %s-%s" % (genarg, self._baseline_name, CIME.utils.get_timestamp())) | |
File "./scripts_regression_tests.py", line 938, in simple_test | |
expected_stat=(0 if expect_works or self._hasbatch else CIME.utils.TESTS_FAILED_ERR_CODE)) | |
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result | |
test_obj.assertEqual(stat, expected_stat, msg=msg) | |
AssertionError: | |
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 -g -o -b fake_testing_only_20170501_134028 TESTRUNDIFF_P1.f19_g16_rx1.A -t fake_testing_only_20170501_134028-20170501_134028 | |
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests | |
SHOULD HAVE WORKED, INSTEAD GOT STAT 100 | |
OUTPUT: Using project from .cesm_proj: cli112 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134028-20170501_134028 | |
RUNNING TESTS: | |
TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi | |
Starting CREATE_NEWCASE for test TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi in 3.218597 seconds (PASS) | |
Starting XML for test TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished XML for test TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi in 0.882618 seconds (PASS) | |
Starting SETUP for test TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished SETUP for test TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi in 9.062350 seconds (FAIL). [COMPLETED 1 of 1] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134028-20170501_134028 | |
Due to presence of batch system, create_test will exit before tests are complete. | |
To force create_test to wait for full completion, use --wait | |
At test-scheduler close, state is: | |
FAIL TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNDIFF_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134028-20170501_134028 | |
test-scheduler took 13.6280629635 seconds | |
ERRPUT: Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
====================================================================== | |
FAIL: test_rebless_namelist (__main__.Q_TestBlessTestResults) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 1000, in test_rebless_namelist | |
self.simple_test(True, "%s -n -t %s-%s" % (genarg, self._baseline_name, CIME.utils.get_timestamp())) | |
File "./scripts_regression_tests.py", line 934, in simple_test | |
expected_stat=(0 if expect_works else CIME.utils.TESTS_FAILED_ERR_CODE)) | |
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result | |
test_obj.assertEqual(stat, expected_stat, msg=msg) | |
AssertionError: | |
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 -g -o -b fake_testing_only_20170501_134042 cime_test_only_pass -n -t fake_testing_only_20170501_134042-20170501_134042 | |
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests | |
SHOULD HAVE WORKED, INSTEAD GOT STAT 100 | |
OUTPUT: Using project from .cesm_proj: cli112 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
RUNNING TESTS: | |
TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi | |
TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi | |
TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi | |
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi in 2.976236 seconds (PASS) | |
Starting XML for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi in 3.025545 seconds (PASS) | |
Finished CREATE_NEWCASE for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 3.045010 seconds (PASS) | |
Starting XML for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting XML for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished XML for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi in 1.265015 seconds (PASS) | |
Finished XML for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 1.172382 seconds (PASS) | |
Starting SETUP for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting SETUP for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished XML for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi in 1.320679 seconds (PASS) | |
Starting SETUP for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished SETUP for test TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi in 5.423082 seconds (FAIL). [COMPLETED 1 of 3] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
Finished SETUP for test TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi in 5.364271 seconds (FAIL). [COMPLETED 2 of 3] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
Finished SETUP for test TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi in 5.831543 seconds (FAIL). [COMPLETED 3 of 3] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
At test-scheduler close, state is: | |
FAIL TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f45_g37_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
FAIL TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.ne30_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
FAIL TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/TESTRUNPASS_P1.f19_g16_rx1.A.titan_pgi.G.fake_testing_only_20170501_134042-20170501_134042 | |
test-scheduler took 10.2657248974 seconds | |
ERRPUT: Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
Caught exception: ERROR: Fatal error in case.cmpgen_namelists: | |
ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 575, in _run_catch_exceptions | |
return run(test) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/test_scheduler.py", line 544, in _setup_phase | |
expect(cmdstat in [0, TESTS_FAILED_ERR_CODE], "Fatal error in case.cmpgen_namelists: %s" % (output + "\n" + errput)) | |
File "/autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 29, in expect | |
raise exc_type("%s %s" % (error_prefix,error_msg)) | |
====================================================================== | |
FAIL: test_full_system (__main__.Z_FullSystemTest) | |
---------------------------------------------------------------------- | |
Traceback (most recent call last): | |
File "./scripts_regression_tests.py", line 1161, in test_full_system | |
run_cmd_assert_result(self, create_test_cmd) | |
File "./scripts_regression_tests.py", line 53, in run_cmd_assert_result | |
test_obj.assertEqual(stat, expected_stat, msg=msg) | |
AssertionError: | |
COMMAND: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_test cime_developer --test-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --walltime 0:15:00 -t fake_testing_only_20170501_134054 | |
FROM_DIR: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/tests | |
SHOULD HAVE WORKED, INSTEAD GOT STAT 100 | |
OUTPUT: Using project from .cesm_proj: cli112 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS.T42_T42.S.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_D_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SEQ_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS.ne30_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERI.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/DAE.f19_f19.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/NCK_Ld3.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERIO.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS_N2.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Creating test directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERP.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
RUNNING TESTS: | |
ERR.f45_g37_rx1.A.titan_pgi | |
SMS.T42_T42.S.titan_pgi | |
SMS_D_Ln9.f19_g16_rx1.A.titan_pgi | |
SEQ_Ln9.f19_g16_rx1.A.titan_pgi | |
ERS.ne30_g16_rx1.A.titan_pgi | |
ERI.f45_g37.X.titan_pgi | |
DAE.f19_f19.A.titan_pgi | |
NCK_Ld3.f45_g37_rx1.A.titan_pgi | |
ERIO.f45_g37.X.titan_pgi | |
ERS_N2.f19_g16_rx1.A.titan_pgi | |
ERP.f45_g37_rx1.A.titan_pgi | |
Starting CREATE_NEWCASE for test ERR.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test SMS.T42_T42.S.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test SMS_D_Ln9.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test SEQ_Ln9.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test ERS.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test ERI.f45_g37.X.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test DAE.f19_f19.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test NCK_Ld3.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test ERIO.f45_g37.X.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test ERS_N2.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting CREATE_NEWCASE for test ERP.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test ERIO.f45_g37.X.titan_pgi in 2.837967 seconds (PASS) | |
Starting XML for test ERIO.f45_g37.X.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test ERR.f45_g37_rx1.A.titan_pgi in 3.150496 seconds (PASS) | |
Finished CREATE_NEWCASE for test SMS.T42_T42.S.titan_pgi in 3.152171 seconds (PASS) | |
Finished CREATE_NEWCASE for test ERI.f45_g37.X.titan_pgi in 3.151328 seconds (PASS) | |
Finished CREATE_NEWCASE for test ERS_N2.f19_g16_rx1.A.titan_pgi in 3.167497 seconds (PASS) | |
Finished CREATE_NEWCASE for test SMS_D_Ln9.f19_g16_rx1.A.titan_pgi in 3.201138 seconds (PASS) | |
Starting XML for test ERR.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting XML for test SMS.T42_T42.S.titan_pgi with 1 procs | |
Starting XML for test ERI.f45_g37.X.titan_pgi with 1 procs | |
Starting XML for test ERS_N2.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting XML for test SMS_D_Ln9.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished CREATE_NEWCASE for test ERS.ne30_g16_rx1.A.titan_pgi in 3.323462 seconds (PASS) | |
Finished CREATE_NEWCASE for test SEQ_Ln9.f19_g16_rx1.A.titan_pgi in 3.327798 seconds (PASS) | |
Finished CREATE_NEWCASE for test NCK_Ld3.f45_g37_rx1.A.titan_pgi in 3.327673 seconds (PASS) | |
Finished CREATE_NEWCASE for test DAE.f19_f19.A.titan_pgi in 3.331502 seconds (PASS) | |
Finished CREATE_NEWCASE for test ERP.f45_g37_rx1.A.titan_pgi in 3.355942 seconds (PASS) | |
Starting XML for test SEQ_Ln9.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting XML for test ERS.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting XML for test DAE.f19_f19.A.titan_pgi with 1 procs | |
Starting XML for test NCK_Ld3.f45_g37_rx1.A.titan_pgi with 1 procs | |
Starting XML for test ERP.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished XML for test ERS_N2.f19_g16_rx1.A.titan_pgi in 5.140179 seconds (PASS) | |
Starting SETUP for test ERS_N2.f19_g16_rx1.A.titan_pgi with 1 procs | |
Finished XML for test ERIO.f45_g37.X.titan_pgi in 5.651304 seconds (PASS) | |
Starting SETUP for test ERIO.f45_g37.X.titan_pgi with 1 procs | |
Finished XML for test ERR.f45_g37_rx1.A.titan_pgi in 5.990485 seconds (PASS) | |
Starting SETUP for test ERR.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished XML for test SMS_D_Ln9.f19_g16_rx1.A.titan_pgi in 6.094541 seconds (PASS) | |
Finished XML for test SMS.T42_T42.S.titan_pgi in 6.145629 seconds (PASS) | |
Finished XML for test NCK_Ld3.f45_g37_rx1.A.titan_pgi in 5.921625 seconds (PASS) | |
Finished XML for test ERS.ne30_g16_rx1.A.titan_pgi in 5.974710 seconds (PASS) | |
Starting SETUP for test SMS.T42_T42.S.titan_pgi with 1 procs | |
Starting SETUP for test SMS_D_Ln9.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting SETUP for test ERS.ne30_g16_rx1.A.titan_pgi with 1 procs | |
Starting SETUP for test NCK_Ld3.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished XML for test DAE.f19_f19.A.titan_pgi in 6.139459 seconds (PASS) | |
Finished XML for test SEQ_Ln9.f19_g16_rx1.A.titan_pgi in 6.175391 seconds (PASS) | |
Starting SETUP for test SEQ_Ln9.f19_g16_rx1.A.titan_pgi with 1 procs | |
Starting SETUP for test DAE.f19_f19.A.titan_pgi with 1 procs | |
Finished XML for test ERI.f45_g37.X.titan_pgi in 8.748890 seconds (PASS) | |
Starting SETUP for test ERI.f45_g37.X.titan_pgi with 1 procs | |
Finished SETUP for test ERS_N2.f19_g16_rx1.A.titan_pgi in 3.792852 seconds (FAIL). [COMPLETED 1 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS_N2.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test ERIO.f45_g37.X.titan_pgi in 3.825232 seconds (FAIL). [COMPLETED 2 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERIO.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test ERR.f45_g37_rx1.A.titan_pgi in 4.044292 seconds (FAIL). [COMPLETED 3 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished XML for test ERP.f45_g37_rx1.A.titan_pgi in 9.859257 seconds (PASS) | |
Starting SETUP for test ERP.f45_g37_rx1.A.titan_pgi with 1 procs | |
Finished SETUP for test SMS.T42_T42.S.titan_pgi in 4.112611 seconds (FAIL). [COMPLETED 4 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS.T42_T42.S.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test NCK_Ld3.f45_g37_rx1.A.titan_pgi in 4.082132 seconds (FAIL). [COMPLETED 5 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/NCK_Ld3.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test SEQ_Ln9.f19_g16_rx1.A.titan_pgi in 4.134969 seconds (FAIL). [COMPLETED 6 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SEQ_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test ERS.ne30_g16_rx1.A.titan_pgi in 4.390896 seconds (FAIL). [COMPLETED 7 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS.ne30_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test DAE.f19_f19.A.titan_pgi in 4.186136 seconds (FAIL). [COMPLETED 8 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/DAE.f19_f19.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test SMS_D_Ln9.f19_g16_rx1.A.titan_pgi in 4.419777 seconds (FAIL). [COMPLETED 9 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_D_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test ERI.f45_g37.X.titan_pgi in 3.433091 seconds (FAIL). [COMPLETED 10 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERI.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054 | |
Finished SETUP for test ERP.f45_g37_rx1.A.titan_pgi in 3.403327 seconds (FAIL). [COMPLETED 11 of 11] Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERP.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
Due to presence of batch system, create_test will exit before tests are complete. | |
To force create_test to wait for full completion, use --wait | |
At test-scheduler close, state is: | |
FAIL ERR.f45_g37_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL SMS.T42_T42.S.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS.T42_T42.S.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL SMS_D_Ln9.f19_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SMS_D_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL SEQ_Ln9.f19_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/SEQ_Ln9.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL ERS.ne30_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS.ne30_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL ERI.f45_g37.X.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERI.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL DAE.f19_f19.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/DAE.f19_f19.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL NCK_Ld3.f45_g37_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/NCK_Ld3.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL ERIO.f45_g37.X.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERIO.f45_g37.X.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL ERS_N2.f19_g16_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERS_N2.f19_g16_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
FAIL ERP.f45_g37_rx1.A.titan_pgi (phase SETUP) | |
Case dir: /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERP.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
test-scheduler took 17.0072610378 seconds | |
ERRPUT: | |
---------------------------------------------------------------------- | |
Ran 99 tests in 457.774s | |
FAILED (failures=18, skipped=5) | |
>cat /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054/TestStatus.log | |
2017-05-01 13:40:58: CREATE_NEWCASE PASSED for test 'ERR.f45_g37_rx1.A.titan_pgi'. | |
Command: /autofs/nccs-svm1_home1/gb9/Projects/cime/scripts/create_newcase --case /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 --res f45_g37_rx1 --mach titan --compiler pgi --compset A --test --project cli112 --output-root /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334 --walltime 0:15:00 | |
Output: Compset longname is 2000_DATM%NYF_SLND_DICE%SSMI_DOCN%DOM_DROF%NYF_SGLC_SWAV | |
Compset specification file is /autofs/nccs-svm1_home1/gb9/Projects/cime/src/drivers/mct/cime_config/config_compsets.xml | |
Pes specification file is /autofs/nccs-svm1_home1/gb9/Projects/cime/src/drivers/mct/cime_config/config_pes.xml | |
Pes setting: grid is a%4x5_l%4x5_oi%gx3v7_r%rx1_m%gx3v7_g%null_w%null | |
Pes setting: compset is 2000_DATM%NYF_SLND_DICE%SSMI_DOCN%DOM_DROF%NYF_SGLC_SWAV | |
Pes setting: tasks is {'NTASKS_ATM': -1, 'NTASKS_ICE': -1, 'NTASKS_CPL': -1, 'NTASKS_LND': -1, 'NTASKS_WAV': -1, 'NTASKS_ROF': -1, 'NTASKS_OCN': -1, 'NTASKS_GLC': -1, 'NTASKS_ESP': -1} | |
Pes other settings: {'comment': 'none'} | |
Compset is: 2000_DATM%NYF_SLND_DICE%SSMI_DOCN%DOM_DROF%NYF_SGLC_SWAV | |
Grid is: a%4x5_l%4x5_oi%gx3v7_r%rx1_m%gx3v7_g%null_w%null | |
Components in compset are: ['datm', 'slnd', 'dice', 'docn', 'drof', 'sglc', 'swav', 'sesp', 'cpl'] | |
Creating Case directory /ccs/home/gb9/acme_scratch/cli115/scripts_regression_test.20170501_133334/ERR.f45_g37_rx1.A.titan_pgi.fake_testing_only_20170501_134054 | |
--------------------------------------------------- | |
2017-05-01 13:41:07: SETUP FAILED for test 'ERR.f45_g37_rx1.A.titan_pgi'. | |
Command: ./case.setup | |
Output: | |
Errput: ERROR: module command /opt/modules/default/bin/modulecmd python load cray-libsci/16.06.1 failed with message: | |
cray-libsci/16.06.1(183):ERROR:150: Module 'cray-libsci/16.06.1' conflicts with the currently loaded module(s) 'cray-libsci/16.11.1' | |
cray-libsci/16.06.1(183):ERROR:102: Tcl command execution failed: libsci::check_conflicts | |
--------------------------------------------------- | |
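All 18 failures trace back to the same root cause visible in the `Errput` above: `cray-libsci/16.11.1` is already loaded in the shell environment, so the Tcl conflict check fires when CIME tries to load `cray-libsci/16.06.1` during `case.setup`. A possible workaround (untested here; module names are taken directly from the error message) is to clear the preloaded module before rerunning the tests:

```
# Swap the preloaded cray-libsci for the version CIME's machine config
# expects, so libsci::check_conflicts no longer fails:
module swap cray-libsci/16.11.1 cray-libsci/16.06.1

# ...or simply unload it and let CIME load the version it wants:
module unload cray-libsci
```

Alternatively, updating the machine's module list in CIME's `config_machines.xml` to the currently installed `cray-libsci` version would fix this at the source.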