@ashkurti, created February 26, 2015 14:21
CoCo/AMBER workflow - default files - extasy-0.1.3-beta (client-side radical.pilot log)
2015:02:26 14:09:08 radical.pilot.MainProcess: [INFO ] radical.pilot version: 0.23 (v0.23)
2015:02:26 14:09:08 24164 MainThread saga : [INFO ] saga-python version: v0.26
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.myproxy
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.myproxy for saga.Context API with URL scheme(s) ['myproxy://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.x509
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.x509 for saga.Context API with URL scheme(s) ['x509://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.ssh
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.ssh for saga.Context API with URL scheme(s) ['ssh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.userpass
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.userpass for saga.Context API with URL scheme(s) ['userpass://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.shell.shell_job
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_job for saga.job.Service API with URL scheme(s) ['fork://', 'local://', 'ssh://', 'gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_job for saga.job.Job API with URL scheme(s) ['fork://', 'local://', 'ssh://', 'gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.shell.shell_file
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.namespace.Directory API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.namespace.Entry API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.filesystem.Directory API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.filesystem.File API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.shell.shell_resource
2015:02:26 14:09:08 24164 MainThread saga.Engine : [WARNING ] Skipping adaptor saga.adaptors.shell.shell_resource: beta versions are disabled (v0.1.beta)
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.redis.redis_advert
2015:02:26 14:09:08 24164 MainThread saga.Engine : [WARNING ] Skipping adaptor saga.adaptors.redis.redis_advert 1: module loading failed: No module named redis
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.sge.sgejob
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.sge.sgejob for saga.job.Service API with URL scheme(s) ['sge://', 'sge+ssh://', 'sge+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.sge.sgejob for saga.job.Job API with URL scheme(s) ['sge://', 'sge+ssh://', 'sge+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.pbs.pbsjob
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.pbs.pbsjob for saga.job.Service API with URL scheme(s) ['pbs://', 'pbs+ssh://', 'pbs+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.pbs.pbsjob for saga.job.Job API with URL scheme(s) ['pbs://', 'pbs+ssh://', 'pbs+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.lsf.lsfjob
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.lsf.lsfjob for saga.job.Service API with URL scheme(s) ['lsf://', 'lsf+ssh://', 'lsf+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.lsf.lsfjob for saga.job.Job API with URL scheme(s) ['lsf://', 'lsf+ssh://', 'lsf+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.condor.condorjob
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.condor.condorjob for saga.job.Service API with URL scheme(s) ['condor://', 'condor+ssh://', 'condor+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.condor.condorjob for saga.job.Job API with URL scheme(s) ['condor://', 'condor+ssh://', 'condor+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.slurm.slurm_job
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.slurm.slurm_job for saga.job.Service API with URL scheme(s) ['slurm://', 'slurm+ssh://', 'slurm+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.slurm.slurm_job for saga.job.Job API with URL scheme(s) ['slurm://', 'slurm+ssh://', 'slurm+gsissh://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.http.http_file
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.http.http_file for saga.namespace.Entry API with URL scheme(s) ['http://', 'https://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.http.http_file for saga.filesystem.File API with URL scheme(s) ['http://', 'https://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.aws.ec2_resource
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.aws.ec2_resource for saga.Context API with URL scheme(s) ['ec2://', 'ec2_keypair://', 'openstack://', 'eucalyptus://', 'euca://', 'aws://', 'amazon://', 'http://', 'https://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.aws.ec2_resource for saga.resource.Manager API with URL scheme(s) ['ec2://', 'ec2_keypair://', 'openstack://', 'eucalyptus://', 'euca://', 'aws://', 'amazon://', 'http://', 'https://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.aws.ec2_resource for saga.resource.Compute API with URL scheme(s) ['ec2://', 'ec2_keypair://', 'openstack://', 'eucalyptus://', 'euca://', 'aws://', 'amazon://', 'http://', 'https://']
2015:02:26 14:09:08 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.loadl.loadljob
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.loadl.loadljob for saga.job.Service API with URL scheme(s) ['loadl://', 'loadl+ssh://', 'loadl+gsissh://']
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.loadl.loadljob for saga.job.Job API with URL scheme(s) ['loadl://', 'loadl+ssh://', 'loadl+gsissh://']
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.globus_online.go_file
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.namespace.Directory API with URL scheme(s) ['go://']
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.namespace.Entry API with URL scheme(s) ['go://']
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.filesystem.Directory API with URL scheme(s) ['go://']
2015:02:26 14:09:09 24164 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.filesystem.File API with URL scheme(s) ['go://']
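The adaptor registry built above is how saga-python dispatches API calls: each adaptor declares the URL schemes it handles, and the scheme of the URL passed to an API object selects the adaptor at runtime. A minimal sketch of exercising that dispatch (host and job are illustrative, not taken from this log):

```python
# 'slurm+ssh://' was registered by saga.adaptors.slurm.slurm_job above, so
# this Service is backed by the SLURM adaptor over an SSH connection.
import saga

js = saga.job.Service("slurm+ssh://stampede.tacc.utexas.edu")

jd = saga.job.Description()
jd.executable = "/bin/date"

job = js.create_job(jd)
job.run()
job.wait()
print(job.state)   # DONE / FAILED / ...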
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/known_hosts.old (no public key: /users/ardita/.ssh/known_hosts.old.pub)
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/k.dat (no public key: /users/ardita/.ssh/k.dat.pub)
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] default ssh key at /users/ardita/.ssh/id_rsa
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/environment (no public key: /users/ardita/.ssh/environment.pub)
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/known_hosts (no public key: /users/ardita/.ssh/known_hosts.pub)
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/authorized_keys (no public key: /users/ardita/.ssh/authorized_keys.pub)
2015:02:26 14:09:09 24164 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/config (no public key: /users/ardita/.ssh/config.pub)
2015:02:26 14:09:09 24164 MainThread saga.ContextSSH : [INFO ] init SSH context for key at '/users/ardita/.ssh/id_rsa' done
2015:02:26 14:09:09 24164 MainThread saga.DefaultSession : [DEBUG ] default context [saga.adaptor.ssh ] : {'LifeTime' : '-1', 'Type' : 'ssh', 'UserCert' : '/users/ardita/.ssh/id_rsa.pub', 'UserKey' : '/users/ardita/.ssh/id_rsa'}
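The DefaultSession line above shows the SSH context assembled automatically from the scanned key pairs. Constructing the same context explicitly would look roughly like this (key paths mirror the log; attribute names follow the saga-python Context API):

```python
import saga

# Explicit equivalent of the default SSH context logged above.
ctx = saga.Context("ssh")
ctx.user_key  = "/users/ardita/.ssh/id_rsa"
ctx.user_cert = "/users/ardita/.ssh/id_rsa.pub"

session = saga.Session()
session.add_context(ctx)
```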
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] using database url mongodb://extasy:extasyproject@extasy-db.epcc.ed.ac.uk/radicalpilot
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] using database path radicalpilot
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for nersc.hopper
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.sierra
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.alamo
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.hotel
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.india
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for lrz.supermuc
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for epsrc.archer
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for rice.biou
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for rice.davinci
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for local.localhost
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for radical.tutorial
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for das4.fs2
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for ncar.yellowstone
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for iu.bigred2
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for iu.quarry
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.stampede
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.trestles
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.gordon
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.blacklight
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.lonestar
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] New Session created{'uid': '54ef2905f8cdba5e6481a57c', 'created': datetime.datetime(2015, 2, 26, 14, 9, 9, 302253), 'database_auth': 'extasy:extasyproject', 'database_name': 'radicalpilot', 'database_url': 'mongodb://extasy:extasyproject@extasy-db.epcc.ed.ac.uk/radicalpilot', 'last_reconnect': None}.
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] Worker thread (ID: Thread-1[140320272545536]) for PilotManager 54ef2905f8cdba5e6481a57d started.
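The session and pilot-manager startup logged above corresponds to the first few lines of a RADICAL-Pilot 0.23 client script; a minimal sketch, with the MongoDB URL taken verbatim from the 'using database url' line:

```python
import radical.pilot as rp

# The session and pilot manager whose creation is logged above.
session = rp.Session(
    database_url="mongodb://extasy:extasyproject@extasy-db.epcc.ed.ac.uk/radicalpilot")
pmgr = rp.PilotManager(session=session)
```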
2015:02:26 14:09:09 radical.pilot.MainProcess: [WARNING ] using alias 'xsede.stampede' for deprecated resource key 'stampede.tacc.utexas.edu'
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] saga.utils.PTYShell ('sftp://stampede.tacc.utexas.edu/')
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f9ee15a7350>
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] open master pty for [ssh] [ardi@stampede.tacc.utexas.edu] ardi: /usr/bin/env TERM=vt100 "/usr/bin/ssh" -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu'
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee20cca10>
2015:02:26 14:09:09 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:09 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for PilotManager 54ef2905f8cdba5e6481a57d.
2015:02:26 14:09:10 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:10 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 1\n)
2015:02:26 14:09:11 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:11 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2015:02:26 14:09:12 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 2396] (Last login: Thu Feb 26 08:00:3 ... __________________________\n\n)
2015:02:26 14:09:12 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Thu Feb 26 08:00:38 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:12 radical.pilot.MainProcess: [DEBUG ] waiting for prompt trigger HELLO_2_SAGA: (5) (Last login: Thu Feb 26 08:00:38 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.1 5.0 2.37 5069 150000 3.38 |\n| /work 9.3 1024.0 0.91 619828 3000000 20.66 |\n-------------------------------------------------------------------------------\n)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 252] (\nTip 4 (See "module help tacc_tips" for features or how to disable)\n\n Use "!!" to repeat the most recent command. Use "!mpi" to repeat the most recent command that started with "mpi" and "!?mpi?" for the most recent one that contained "mpi".\n\n)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 53] (login1.stampede(1)$ $$HELLO_1_SAGA\n$$HELLO_2_SAGA\n$)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] got shell prompt trigger (4) (
See "man slurm" or the Stampede user guide for more detailed information.
--> To see all the software that is available across all compilers and
mpi stacks, issue: "module spider"
--> To see which software packages are available with your currently loaded
compiler and mpi stack, issue: "module avail"
--> Stampede has three parallel file systems: $HOME (permanent,
quota'd, backed-up) $WORK (permanent, quota'd, not backed-up) and
$SCRATCH (high-speed purged storage). The "cdw" and "cds" aliases
are provided as a convenience to change to your $WORK and $SCRATCH
directories, respectively.
______________________________________________________________________________
----------------------- Project balances for user ardi ------------------------
| Name Avail SUs Expires | Name Avail SUs Expires |
| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 |
-------------------------- Disk quotas for user ardi --------------------------
| Disk Usage (GB) Limit %Used File Usage Limit %Used |
| /home1 0.1 5.0 2.37 5069 150000 3.38 |
| /work 9.3 1024.0 0.91 619828 3000000 20.66 |
-------------------------------------------------------------------------------
Tip 4 (See "module help tacc_tips" for features or how to disable)
Use "!!" to repeat the most recent command. Use "!mpi" to repeat the most recent command that started with "mpi" and "!?mpi?" for the most recent one that contained "mpi".
login1.stampede(1)$ $$HELLO_1_SAGA
$$HELLO_2_SAGA)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (
$)
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (
$)
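The exchange above is SAGA's prompt-detection handshake: it forces a known prompt in both sh- and csh-family shells, prints unique HELLO_n_SAGA triggers so MOTD/banner noise can be skipped, and finally installs a prompt that carries each command's exit status. The same handshake sketched with pexpect, purely for illustration (SAGA implements it with its own pty_process layer, not pexpect):

```python
import pexpect

child = pexpect.spawn("/usr/bin/env TERM=vt100 /usr/bin/ssh ardi@stampede.tacc.utexas.edu")

# Force a known prompt in both sh- and csh-family shells, then emit a unique
# trigger string so banner output can be skipped reliably.
child.sendline("export PS1='$' > /dev/null 2>&1 || set prompt='$'")
child.sendline("printf 'HELLO_%d_SAGA\\n' 1")
child.expect("HELLO_1_SAGA")           # everything before this is banner noise

# Install a prompt that carries each command's exit status, as seen above.
child.sendline("unset PROMPT_COMMAND; unset HISTFILE; PS1='PROMPT-$?->'; export PS1")
child.expect(r"PROMPT-\d+->")
```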
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee15a7810>
2015:02:26 14:09:13 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2015:02:26 14:09:13 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:14 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:14 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 28] ( printf 'HELLO_%d_SAGA\n' 1\n)
2015:02:26 14:09:15 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:15 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2015:02:26 14:09:16 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 2396] (Last login: Thu Feb 26 08:00:5 ... __________________________\n\n)
2015:02:26 14:09:16 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Thu Feb 26 08:00:59 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:16 radical.pilot.MainProcess: [DEBUG ] waiting for prompt trigger HELLO_2_SAGA: (5) (Last login: Thu Feb 26 08:00:59 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.1 5.0 2.37 5069 150000 3.38 |\n| /work 9.3 1024.0 0.91 619828 3000000 20.66 |\n-------------------------------------------------------------------------------\n)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 123] (\nTip 102 (See "module help tacc_tips" for features or how to disable)\n\n To delete the previous word use "Ctrl+W"\n\n)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 37] (login4.stampede(1)$ $$HELLO_1_SAGA\n$)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] got shell prompt trigger (4) (
See "man slurm" or the Stampede user guide for more detailed information.
--> To see all the software that is available across all compilers and
mpi stacks, issue: "module spider"
--> To see which software packages are available with your currently loaded
compiler and mpi stack, issue: "module avail"
--> Stampede has three parallel file systems: $HOME (permanent,
quota'd, backed-up) $WORK (permanent, quota'd, not backed-up) and
$SCRATCH (high-speed purged storage). The "cdw" and "cds" aliases
are provided as a convenience to change to your $WORK and $SCRATCH
directories, respectively.
______________________________________________________________________________
----------------------- Project balances for user ardi ------------------------
| Name Avail SUs Expires | Name Avail SUs Expires |
| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 |
-------------------------- Disk quotas for user ardi --------------------------
| Disk Usage (GB) Limit %Used File Usage Limit %Used |
| /home1 0.1 5.0 2.37 5069 150000 3.38 |
| /work 9.3 1024.0 0.91 619828 3000000 20.66 |
-------------------------------------------------------------------------------
Tip 102 (See "module help tacc_tips" for features or how to disable)
To delete the previous word use "Ctrl+W"
login4.stampede(1)$ $$HELLO_1_SAGA)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (
$)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] waiting for prompt trigger HELLO_2_SAGA: (5) (
$)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 31] ($HELLO_2_SAGA\n$HELLO_2_SAGA\n$)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] got shell prompt trigger (4) ($HELLO_2_SAGA
$HELLO_2_SAGA)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (
$)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (
$)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] running command shell: exec /bin/sh -i
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 1] ($)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] flush: [ 6] [ ] (flush pty read cache)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 10] (PROMPT-0->)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] got new shell prompt
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] flush: [ 6] [ ] (flush pty read cache)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] flush: [ 6] [ ] (flush pty read cache)
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] run_sync: echo "WORKDIR: $WORK"
2015:02:26 14:09:17 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 22] (echo "WORKDIR: $WORK"\n)
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 37] (WORKDIR: /work/02998/ardi\nPROMPT-0->)
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Determined remote working directory for sftp://stampede.tacc.utexas.edu/: '/work/02998/ardi'
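The working-directory probe above resolves $WORK on the target before the sandbox path is built. Internally this goes through PTYShell.run_sync; a hedged sketch (PTYShell is an internal saga-python utility, so the constructor details may differ between releases):

```python
import saga
from saga.utils.pty_shell import PTYShell

shell = PTYShell(saga.Url("sftp://stampede.tacc.utexas.edu/"), saga.Session())
ret, out, err = shell.run_sync('echo "WORKDIR: $WORK"')
workdir = out.split(":", 1)[1].strip()   # '/work/02998/ardi' in this log
```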
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] PTYShell del <saga.utils.pty_shell.PTYShell object at 0x7f9ee15a7350>
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] Launching ComputePilot {u'state': u'PendingLaunch', u'commands': [], u'description': {u'project': u'TG-MCB090174', u'resource': u'stampede.tacc.utexas.edu', u'queue': u'normal', u'sandbox': None, u'cleanup': None, u'pilot_agent_priv': None, u'access_schema': None, u'memory': None, u'cores': 16, u'runtime': 60}, u'sagajobid': None, u'started': None, u'cores_per_node': None, u'output_transfer_started': None, u'finished': None, u'submitted': datetime.datetime(2015, 2, 26, 14, 9, 18, 104000), u'output_transfer_finished': None, u'sandbox': u'sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/', u'pilotmanager': u'54ef2905f8cdba5e6481a57d', u'unitmanager': None, u'heartbeat': None, u'statehistory': [{u'timestamp': datetime.datetime(2015, 2, 26, 14, 9, 18, 103000), u'state': u'PendingLaunch'}], u'input_transfer_started': None, u'_id': ObjectId('54ef2905f8cdba5e6481a57e'), u'input_transfer_finished': None, u'nodes': None, u'log': []}
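The description dict serialized in the 'Launching ComputePilot' entry above maps one-to-one onto a ComputePilotDescription. Reconstructed from the logged fields (pmgr is the PilotManager from the earlier sketch):

```python
import radical.pilot as rp

pdesc = rp.ComputePilotDescription()
pdesc.resource = "stampede.tacc.utexas.edu"   # resolved to alias 'xsede.stampede'
pdesc.project  = "TG-MCB090174"
pdesc.queue    = "normal"
pdesc.cores    = 16
pdesc.runtime  = 60                           # minutes

pilot = pmgr.submit_pilots(pdesc)
```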
2015:02:26 14:09:18 radical.pilot.MainProcess: [WARNING ] using alias 'xsede.stampede' for deprecated resource key 'stampede.tacc.utexas.edu'
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] Using pilot agent /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] Using bootstrapper /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Copying bootstrapper 'file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh' to agent sandbox (sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e//default_bootstrapper.sh).
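The bootstrapper staging above is an ordinary saga.filesystem transfer from a local file:// URL into the remote sandbox over SFTP; roughly equivalent to:

```python
import saga

# URLs taken from the log line above.
src = saga.filesystem.File(
    "file://localhost//users/ardita/ExTASY-tools/lib/python2.7/"
    "site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh")
src.copy("sftp://stampede.tacc.utexas.edu/work/02998/ardi/"
         "radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/")
```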
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] init_instance file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f9ee025f090>
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] PTYProcess del <saga.utils.pty_process.PTYProcess object at 0x7f9ee15a7810>
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] open master pty for [sh] [localhost] ardita: /usr/bin/env TERM=vt100 "/bin/tcsh" -i'
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee2138950>
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] running: /usr/bin/env TERM=vt100 /bin/tcsh -i
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 6] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] Starting InputFileTransferWorker
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] Starting InputFileTransferWorker
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Worker thread (ID: Thread-3[140320220096256]) for UnitManager 54ef290ef8cdba5e6481a57f started.
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] Loaded scheduler: DirectSubmissionScheduler.
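DirectSubmissionScheduler simply forwards every compute unit to the single pilot attached to the unit manager. Creating that manager in the client script would look like this (constant name as of radical.pilot 0.23; session and pilot come from the earlier sketches):

```python
import radical.pilot as rp

umgr = rp.UnitManager(session=session, scheduler=rp.SCHED_DIRECT_SUBMISSION)
umgr.add_pilots(pilot)
```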
2015:02:26 14:09:18 radical.pilot.MainProcess: [ERROR ] session 54ef2905f8cdba5e6481a57c closing
2015:02:26 14:09:18 radical.pilot.MainProcess: [ERROR ] session 54ef2905f8cdba5e6481a57c closes pmgr 54ef2905f8cdba5e6481a57d
2015:02:26 14:09:18 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d closing
2015:02:26 14:09:18 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d cancel worker Thread-1
2015:02:26 14:09:18 radical.pilot.MainProcess: [ERROR ] pworker Thread-1 stops launcher PilotLauncherWorker-1
2015:02:26 14:09:18 radical.pilot.MainProcess: [ERROR ] launcher PilotLauncherWorker-1 stopping
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 54ef290ef8cdba5e6481a57f.
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 54ef290ef8cdba5e6481a57f.
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 54ef290ef8cdba5e6481a57f.
2015:02:26 14:09:18 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 54ef290ef8cdba5e6481a57f.
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 3] ( \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 45] (Welcome to poirot.pharm.nottingham.ac.uk\n \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 46] (Operating System: CentOS release 6.6 (Final)\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 94] (CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz\n 8 3601 MHz x86_64 Processors\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 63] (Main memory size: 15874 Mbytes\nSwap memory size: 8007 Mbytes\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 47] (Removable media:\n /dev/sr0: CDDVDW SH-222BB\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 3] ( \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 244] (Disk quotas for user ardita (uid 1285): \n Filesystem blocks quota limit grace files quota limit grace\nlund.pharm.nottingham.ac.uk:/users/\n 394G 400G 1812G 115k 0 0 \n \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 31] (Currently Loaded Modulefiles:\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 73] ( 1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 3] ( \n)
2015:02:26 14:09:18 radical.pilot.MainProcess: [INFO ] ComputePilot '54ef2905f8cdba5e6481a57e' state changed from 'PendingLaunch' to 'Launching'.
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 41] (Condor queue: 0 jobs currently in queue\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 131] ( \nFor information about this machine and software available on it go to:\n http://holmes.cancres.nottingham.ac.uk/facilities/\n \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 82] (ardita@poirot 101% export PS1='$' ; set prompt='$'\nexport: Command not found.\n$)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got initial shell prompt (5) (
Welcome to poirot.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz
8 3601 MHz x86_64 Processors
Main memory size: 15874 Mbytes
Swap memory size: 8007 Mbytes
Removable media:
/dev/sr0: CDDVDW SH-222BB
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
lund.pharm.nottingham.ac.uk:/users/
394G 400G 1812G 115k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
ardita@poirot 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] Got initial shell prompt (5) (
Welcome to poirot.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz
8 3601 MHz x86_64 Processors
Main memory size: 15874 Mbytes
Swap memory size: 8007 Mbytes
Removable media:
/dev/sr0: CDDVDW SH-222BB
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
lund.pharm.nottingham.ac.uk:/users/
394G 400G 1812G 115k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
ardita@poirot 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee0279510>
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] running: /usr/bin/env TERM=vt100 /bin/tcsh -i
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 10] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 3] ( \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 45] (Welcome to poirot.pharm.nottingham.ac.uk\n \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 46] (Operating System: CentOS release 6.6 (Final)\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 94] (CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz\n 8 3601 MHz x86_64 Processors\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 63] (Main memory size: 15874 Mbytes\nSwap memory size: 8007 Mbytes\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 18] (Removable media:\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 29] ( /dev/sr0: CDDVDW SH-222BB\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 3] ( \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 244] (Disk quotas for user ardita (uid 1285): \n Filesystem blocks quota limit grace files quota limit grace\nlund.pharm.nottingham.ac.uk:/users/\n 394G 400G 1812G 115k 0 0 \n \n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 31] (Currently Loaded Modulefiles:\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 73] ( 1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1\n)
2015:02:26 14:09:18 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 3] ( \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 41] (Condor queue: 0 jobs currently in queue\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 131] ( \nFor information about this machine and software available on it go to:\n http://holmes.cancres.nottingham.ac.uk/facilities/\n \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 82] (ardita@poirot 101% export PS1='$' ; set prompt='$'\nexport: Command not found.\n$)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got initial shell prompt (5) (
Welcome to poirot.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz
8 3601 MHz x86_64 Processors
Main memory size: 15874 Mbytes
Swap memory size: 8007 Mbytes
Removable media:
/dev/sr0: CDDVDW SH-222BB
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
lund.pharm.nottingham.ac.uk:/users/
394G 400G 1812G 115k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
ardita@poirot 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] Got initial shell prompt (5) (
Welcome to poirot.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz
8 3601 MHz x86_64 Processors
Main memory size: 15874 Mbytes
Swap memory size: 8007 Mbytes
Removable media:
/dev/sr0: CDDVDW SH-222BB
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
lund.pharm.nottingham.ac.uk:/users/
394G 400G 1812G 115k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
ardita@poirot 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] running command shell: exec /bin/sh -i
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 10] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 48] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 8] (sh-4.1$ )
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 10] [ ] (flush pty read cache)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 10] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 10] (PROMPT-0->)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got new shell prompt
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 10] [ ] (flush pty read cache)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: cd /users/ardita/coam-on-stampede_26Feb
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 10] [ 40] (cd /users/ardita/coam-on-stampede_26Feb\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 10] (PROMPT-0->)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 10] [ ] (flush pty read cache)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 10] [ ] (flush pty read cache)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh'
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 10] [ 123] (true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh'\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 10] (PROMPT-0->)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] file initialized (0)()
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f9ee02798d0>
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee20f9e10>
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] running: /usr/bin/env TERM=vt100 /bin/tcsh -i
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 11] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 3] ( \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 45] (Welcome to poirot.pharm.nottingham.ac.uk\n \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 46] (Operating System: CentOS release 6.6 (Final)\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 94] (CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz\n 8 3601 MHz x86_64 Processors\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 63] (Main memory size: 15874 Mbytes\nSwap memory size: 8007 Mbytes\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 18] (Removable media:\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 29] ( /dev/sr0: CDDVDW SH-222BB\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 3] ( \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 244] (Disk quotas for user ardita (uid 1285): \n Filesystem blocks quota limit grace files quota limit grace\nlund.pharm.nottingham.ac.uk:/users/\n 394G 400G 1812G 115k 0 0 \n \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 31] (Currently Loaded Modulefiles:\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 73] ( 1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 3] ( \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 41] (Condor queue: 0 jobs currently in queue\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 131] ( \nFor information about this machine and software available on it go to:\n http://holmes.cancres.nottingham.ac.uk/facilities/\n \n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 82] (ardita@poirot 101% export PS1='$' ; set prompt='$'\nexport: Command not found.\n$)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got initial shell prompt (5) (
Welcome to poirot.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz
8 3601 MHz x86_64 Processors
Main memory size: 15874 Mbytes
Swap memory size: 8007 Mbytes
Removable media:
/dev/sr0: CDDVDW SH-222BB
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
lund.pharm.nottingham.ac.uk:/users/
394G 400G 1812G 115k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
ardita@poirot 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] Got initial shell prompt (5) (
Welcome to poirot.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: GenuineIntel Intel(R) Core(TM) i7-3820 CPU @ 3.60GHz
8 3601 MHz x86_64 Processors
Main memory size: 15874 Mbytes
Swap memory size: 8007 Mbytes
Removable media:
/dev/sr0: CDDVDW SH-222BB
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
lund.pharm.nottingham.ac.uk:/users/
394G 400G 1812G 115k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8 4) geany/1.24.1
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
ardita@poirot 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] running command shell: exec /bin/sh -i
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 11] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 48] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 8] (sh-4.1$ )
2015:02:26 14:09:19 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 11] [ ] (flush pty read cache)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 11] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 10] (PROMPT-0->)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got new shell prompt
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 11] [ ] (flush pty read cache)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: cd /users/ardita/coam-on-stampede_26Feb
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 11] [ 40] (cd /users/ardita/coam-on-stampede_26Feb\n)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 11] [ 10] (PROMPT-0->)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 11] [ ] (flush pty read cache)
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f9ee0279d90>
2015:02:26 14:09:20 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2015:02:26 14:09:20 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee2129790>
2015:02:26 14:09:20 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2015:02:26 14:09:20 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:21 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:21 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 28] ( printf 'HELLO_%d_SAGA\n' 1\n)
2015:02:26 14:09:22 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:22 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2015:02:26 14:09:22 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 2396] (Last login: Thu Feb 26 08:00:4 ... __________________________\n\n)
2015:02:26 14:09:22 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Thu Feb 26 08:00:49 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:22 radical.pilot.MainProcess: [DEBUG ] waiting for prompt trigger HELLO_2_SAGA: (5) (Last login: Thu Feb 26 08:00:49 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.1 5.0 2.37 5069 150000 3.38 |\n| /work 9.3 1024.0 0.91 619828 3000000 20.66 |\n-------------------------------------------------------------------------------\n)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 221] (\nTip 80 (See "module help tacc_tips" for features or how to disable)\n\n Want to know detailed information about a file do: "stat your_file_or_dir" to find out the create, access and modify times, permission etc.\n\n)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 53] (login3.stampede(1)$ $$HELLO_1_SAGA\n$$HELLO_2_SAGA\n$)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] got shell prompt trigger (4) (
See "man slurm" or the Stampede user guide for more detailed information.
--> To see all the software that is available across all compilers and
mpi stacks, issue: "module spider"
--> To see which software packages are available with your currently loaded
compiler and mpi stack, issue: "module avail"
--> Stampede has three parallel file systems: $HOME (permanent,
quota'd, backed-up) $WORK (permanent, quota'd, not backed-up) and
$SCRATCH (high-speed purged storage). The "cdw" and "cds" aliases
are provided as a convenience to change to your $WORK and $SCRATCH
directories, respectively.
______________________________________________________________________________
----------------------- Project balances for user ardi ------------------------
| Name Avail SUs Expires | Name Avail SUs Expires |
| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 |
-------------------------- Disk quotas for user ardi --------------------------
| Disk Usage (GB) Limit %Used File Usage Limit %Used |
| /home1 0.1 5.0 2.37 5069 150000 3.38 |
| /work 9.3 1024.0 0.91 619828 3000000 20.66 |
-------------------------------------------------------------------------------
Tip 80 (See "module help tacc_tips" for features or how to disable)
Want to know detailed information about a file do: "stat your_file_or_dir" to find out the create, access and modify times, permission etc.
login3.stampede(1)$ $$HELLO_1_SAGA
$$HELLO_2_SAGA)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (
$)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (
$)
2015:02:26 14:09:23 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] running command shell: exec /bin/sh -i
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 1] ($)
2015:02:26 14:09:23 radical.pilot.MainProcess: [DEBUG ] flush: [ 12] [ ] (flush pty read cache)
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 10] (PROMPT-0->)
2015:02:26 14:09:24 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got new shell prompt
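What the writes above are doing: SAGA disables terminal echo, swaps in a plain /bin/sh, and installs a prompt that embeds the exit status of the last command (PS1='PROMPT-$?->'), so every later run_sync can be keyed off a single regex. A minimal sketch of the same technique, assuming pexpect rather than SAGA's own pty_shell/pty_process layer, and a host reachable with key-based ssh:

# Deterministic-prompt trick: make the remote shell report its own exit codes.
import pexpect

child = pexpect.spawn('ssh ardi@stampede.tacc.utexas.edu',
                      timeout=60, encoding='utf-8')
# Mirror the two writes in the log: quiet shell first, then exit-code prompt.
child.sendline(" stty -echo ; unset HISTFILE ; exec /bin/sh -i")
child.sendline(" unset PROMPT_COMMAND ; PS1='PROMPT-$?->' ; PS2='' ; export PS1 PS2")
child.expect(r'PROMPT-(\d+)->')          # swallow the MOTD up to the first prompt

def run_sync(cmd):
    # Send one command, wait for the prompt, return (exit_code, output).
    child.sendline(cmd)
    child.expect(r'PROMPT-(\d+)->')
    return int(child.match.group(1)), child.before

print(run_sync('whoami'))                # e.g. (0, 'ardi\n')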
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] flush: [ 12] [ ] (flush pty read cache)
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] flush: [ 12] [ ] (flush pty read cache)
2015:02:26 14:09:24 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] write: [ 12] [ 80] (mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/\n)
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] read : [ 12] [ 10] (PROMPT-0->)
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] flush: [ 12] [ ] (flush pty read cache)
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee0353190>
2015:02:26 14:09:24 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/sftp -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2015:02:26 14:09:24 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 43] (Connecting to stampede.tacc.utexas.edu...\n)
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 6] (sftp> )
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Connecting to stampede.tacc.utexas.edu...
sftp> )
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (Connecting to stampede.tacc.utexas.edu...
sftp> )
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] flush: [ 15] [ ] (flush pty read cache)
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] flush: [ 15] [ ] (flush pty read cache)
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] write: [ 15] [ 211] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh" "/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/default_bootstrapper.sh"\n)
2015:02:26 14:09:27 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 216] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pil ot/bootstrapper/default_bootstrapper.sh" "/work/02998/ardi/radical.pilot.sandbox /pilot-54ef2905f8cdba5e6481a57e/default_bootstrapper.sh"\n)
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 215] (Uploading //users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh to /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/default_bootstrapper.sh\n)
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 79] (//users/ardita/ExTASY-tools/lib/python2.7/sit 0% 0 0.0KB/s --:-- ETA)
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 81] (//users/ardita/ExTASY-tools/lib/python2.7/sit 100% 14KB 14.3KB/s 00:00 \n)
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 6] (sftp> )
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] copy done: ['mput', 'Uploading', '//users/ardita/ExTASY-tools/lib/python2.7/sit', 'sftp>']
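The upload above drives an interactive sftp session over a pty, which is why the progress bars and echoed mput commands show up as reads. The same staging step can be approximated non-interactively with OpenSSH's batch mode; a sketch under the assumption that key-based auth succeeds without prompts (paths taken verbatim from the log):

import subprocess

local  = '/users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh'
remote = '/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/default_bootstrapper.sh'

# '-b -' makes sftp read batch commands from stdin and abort on the first error.
proc = subprocess.Popen(
    ['sftp', '-o', 'IdentityFile=/users/ardita/.ssh/id_rsa', '-b', '-',
     'ardi@stampede.tacc.utexas.edu'],
    stdin=subprocess.PIPE)
proc.communicate(('put "%s" "%s"\n' % (local, remote)).encode())
assert proc.returncode == 0, 'sftp upload failed'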
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] Copying agent 'file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py' to agent sandbox (sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/).
2015:02:26 14:09:28 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] init_instance file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py
2015:02:26 14:09:28 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] flush: [ 10] [ ] (flush pty read cache)
2015:02:26 14:09:28 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py'
2015:02:26 14:09:28 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 10] [ 125] (true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py'\n)
2015:02:26 14:09:28 24164 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 10] [ 10] (PROMPT-0->)
2015:02:26 14:09:28 24164 PilotLauncherWorker-1 saga.ShellFile : [INFO ] file initialized (0)()
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] flush: [ 12] [ ] (flush pty read cache)
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] flush: [ 15] [ ] (flush pty read cache)
2015:02:26 14:09:28 radical.pilot.MainProcess: [DEBUG ] flush: [ 15] [ ] (flush pty read cache)
2015:02:26 14:09:29 radical.pilot.MainProcess: [DEBUG ] write: [ 15] [ 212] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py" "/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/radical-pilot-agent.py"\n)
2015:02:26 14:09:29 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 217] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pil ot/agent/radical-pilot-agent-multicore.py" "/work/02998/ardi/radical.pilot.sandb ox/pilot-54ef2905f8cdba5e6481a57e/radical-pilot-agent.py"\n)
2015:02:26 14:09:29 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 216] (Uploading //users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py to /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/radical-pilot-agent.py\n)
2015:02:26 14:09:29 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 79] (//users/ardita/ExTASY-tools/lib/python2.7/sit 0% 0 0.0KB/s --:-- ETA)
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 79] (//users/ardita/ExTASY-tools/lib/python2.7/sit 100% 104KB 103.8KB/s 00:01 )
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 2] (\n)
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 6] (sftp> )
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] copy done: ['mput', 'Uploading', '//users/ardita/ExTASY-tools/lib/python2.7/sit', 'sftp>']
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] saga.job.Service ('slurm+ssh://stampede.tacc.utexas.edu/')
2015:02:26 14:09:30 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Opening shell of type: ssh://stampede.tacc.utexas.edu
2015:02:26 14:09:30 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f9ee0279950>
2015:02:26 14:09:30 24164 PilotLauncherWorker-1 saga.SLURMJobService : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
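That prompt pattern is an ordinary regex: it accepts any line ending in one of $, #, %, > or ] plus trailing whitespace, which is why both the login-node prompt and the sftp> prompt seen earlier trigger it. A quick check:

import re
prompt = re.compile(r'[\$#%>\]]\s*$')
assert prompt.search('login1.stampede(1)$ ')
assert prompt.search('sftp> ')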
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee02797d0>
2015:02:26 14:09:30 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2015:02:26 14:09:30 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 33] ( export PS1='$' ; set prompt='$'\n)
2015:02:26 14:09:31 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:31 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 28] ( printf 'HELLO_%d_SAGA\n' 1\n)
2015:02:26 14:09:32 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2015:02:26 14:09:32 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 2396] (Last login: Thu Feb 26 08:09:1 ... __________________________\n\n)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Thu Feb 26 08:09:12 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] waiting for prompt trigger HELLO_2_SAGA: (5) (Last login: Thu Feb 26 08:09:12 2015 from poirot.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.1 5.0 2.37 5069 150000 3.38 |\n| /work 9.3 1024.0 0.91 619831 3000000 20.66 |\n-------------------------------------------------------------------------------\n)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 380] (\nTip 90 (See "module help tacc_tips" for features or how to disable)\n\n In bash, the ~/.bash_profile or ~/.profile file is sourced on login shells and ~/.bashrc is sourced on non-login interactive shells. You may wish to source your ~/.bashrc inside your ~/.bash_profile or ~/.profile startup script so that you get the same behavior on both kinds of interactive shells.\n\n)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 53] (login1.stampede(1)$ $$HELLO_1_SAGA\n$$HELLO_2_SAGA\n$)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] got shell prompt trigger (4) (
See "man slurm" or the Stampede user guide for more detailed information.
--> To see all the software that is available across all compilers and
mpi stacks, issue: "module spider"
--> To see which software packages are available with your currently loaded
compiler and mpi stack, issue: "module avail"
--> Stampede has three parallel file systems: $HOME (permanent,
quota'd, backed-up) $WORK (permanent, quota'd, not backed-up) and
$SCRATCH (high-speed purged storage). The "cdw" and "cds" aliases
are provided as a convenience to change to your $WORK and $SCRATCH
directories, respectively.
______________________________________________________________________________
----------------------- Project balances for user ardi ------------------------
| Name Avail SUs Expires | Name Avail SUs Expires |
| TG-MCB090174 138809 2015-09-30 | TG-TRA140016 -81080 2015-05-06 |
-------------------------- Disk quotas for user ardi --------------------------
| Disk Usage (GB) Limit %Used File Usage Limit %Used |
| /home1 0.1 5.0 2.37 5069 150000 3.38 |
| /work 9.3 1024.0 0.91 619831 3000000 20.66 |
-------------------------------------------------------------------------------
Tip 90 (See "module help tacc_tips" for features or how to disable)
In bash, the ~/.bash_profile or ~/.profile file is sourced on login shells and ~/.bashrc is sourced on non-login interactive shells. You may wish to source your ~/.bashrc inside your ~/.bash_profile or ~/.profile startup script so that you get the same behavior on both kinds of interactive shells.
login1.stampede(1)$ $$HELLO_1_SAGA
$$HELLO_2_SAGA)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (
$)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (
$)
2015:02:26 14:09:33 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] running command shell: exec /bin/sh -i
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 1] ($)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:33 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 10] (PROMPT-0->)
2015:02:26 14:09:34 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] got new shell prompt
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:34 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Verifying existence of remote SLURM tools.
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:34 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which squeue
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 13] (which squeue\n)
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 27] (/usr/bin/squeue\nPROMPT-0->)
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:34 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which sbatch
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 13] (which sbatch\n)
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 27] (/usr/bin/sbatch\nPROMPT-0->)
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:34 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which scancel
2015:02:26 14:09:34 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 14] (which scancel\n)
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 28] (/usr/bin/scancel\nPROMPT-0->)
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which scontrol
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 15] (which scontrol\n)
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 29] (/usr/bin/scontrol\nPROMPT-0->)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] got cmd prompt (0)(/usr/bin/scontrol
)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] No username provided in URL slurm+ssh://stampede.tacc.utexas.edu/, so we are going to find it with whoami
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: whoami
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 7] (whoami\n)
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 16] (ardi\nPROMPT-0->)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Username detected as: ardi
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] Bootstrap command line: /bin/bash ['-l', 'default_bootstrapper.sh', "-n radicalpilot -s 54ef2905f8cdba5e6481a57c -p 54ef2905f8cdba5e6481a57e -t 60 -c 16 -v 0.23 -m extasy-db.epcc.ed.ac.uk:27017 -a extasy:extasyproject -i /opt/apps/python/epd/7.3.2/bin/python -e 'module purge' -e 'module load TACC' -e 'module load cluster' -e 'module load Linux' -e 'module load mvapich2' -e 'module load python/2.7.3-epd-7.3.2' -e 'module unload xalt' -e 'export TACC_DELETE_FILES=TRUE' -l SLURM -j SSH -k IBRUN -d 10 -b"]
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] Submitting SAGA job with description: {'Queue': 'normal', 'Executable': '/bin/bash', 'WorkingDirectory': '/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/', 'Project': 'TG-MCB090174', 'WallTimeLimit': 60, 'Arguments': ['-l', 'default_bootstrapper.sh', "-n radicalpilot -s 54ef2905f8cdba5e6481a57c -p 54ef2905f8cdba5e6481a57e -t 60 -c 16 -v 0.23 -m extasy-db.epcc.ed.ac.uk:27017 -a extasy:extasyproject -i /opt/apps/python/epd/7.3.2/bin/python -e 'module purge' -e 'module load TACC' -e 'module load cluster' -e 'module load Linux' -e 'module load mvapich2' -e 'module load python/2.7.3-epd-7.3.2' -e 'module unload xalt' -e 'export TACC_DELETE_FILES=TRUE' -l SLURM -j SSH -k IBRUN -d 10 -b"], 'Error': 'AGENT.STDERR', 'Output': 'AGENT.STDOUT', 'TotalCPUCount': 16}
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [WARNING ] number_of_processes not specified in submitted SLURM job description -- defaulting to 1 per total_cpu_count! (16)
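The description dict logged above maps one-to-one onto saga-python's job API. A sketch of the equivalent submission (bootstrap arguments elided here; number_of_processes is set explicitly, which would avoid the WARNING above):

import saga

js = saga.job.Service('slurm+ssh://stampede.tacc.utexas.edu/')

jd = saga.job.Description()
jd.executable          = '/bin/bash'
jd.arguments           = ['-l', 'default_bootstrapper.sh', '...']  # full bootstrap args as logged
jd.queue               = 'normal'
jd.project             = 'TG-MCB090174'
jd.wall_time_limit     = 60        # minutes; rendered as '#SBATCH -t 01:00:00'
jd.total_cpu_count     = 16
jd.number_of_processes = 16        # silences the defaulting warning
jd.working_directory   = '/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/'
jd.output              = 'AGENT.STDOUT'
jd.error               = 'AGENT.STDERR'

job = js.create_job(jd)
job.run()
print(job.id)   # e.g. [slurm+ssh://stampede.tacc.utexas.edu/]-[4909606]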
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [INFO ] Creating working directory /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 80] (mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/\n)
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 10] (PROMPT-0->)
2015:02:26 14:09:35 24164 PilotLauncherWorker-1 saga.SLURMJobService : [INFO ] SLURM script generated:
#!/bin/sh
#SBATCH -J "SAGAPythonSLURMJob"
#SBATCH --ntasks=16
#SBATCH --cpus-per-task=1
#SBATCH -D /work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/
#SBATCH -o AGENT.STDOUT
#SBATCH -e AGENT.STDERR
#SBATCH -t 01:00:00
#SBATCH -p normal
#SBATCH -A TG-MCB090174
/bin/bash -l default_bootstrapper.sh -n radicalpilot -s 54ef2905f8cdba5e6481a57c -p 54ef2905f8cdba5e6481a57e -t 60 -c 16 -v 0.23 -m extasy-db.epcc.ed.ac.uk:27017 -a extasy:extasyproject -i /opt/apps/python/epd/7.3.2/bin/python -e 'module purge' -e 'module load TACC' -e 'module load cluster' -e 'module load Linux' -e 'module load mvapich2' -e 'module load python/2.7.3-epd-7.3.2' -e 'module unload xalt' -e 'export TACC_DELETE_FILES=TRUE' -l SLURM -j SSH -k IBRUN -d 10 -b
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:35 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f9ee1d986d0>
2015:02:26 14:09:35 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/sftp -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=no -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2015:02:26 14:09:36 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 43] (Connecting to stampede.tacc.utexas.edu...\n)
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 6] (sftp> )
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Connecting to stampede.tacc.utexas.edu...
sftp> )
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (Connecting to stampede.tacc.utexas.edu...
sftp> )
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] flush: [ 17] [ ] (flush pty read cache)
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] flush: [ 17] [ ] (flush pty read cache)
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 49] (mput "/tmp/tmp_hdaADq.slurm" "tmp_hdaADq.slurm"\n)
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 50] (mput "/tmp/tmp_hdaADq.slurm" "tmp_hdaADq.slurm"\n)
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 71] (Uploading /tmp/tmp_hdaADq.slurm to /home1/02998/ardi/tmp_hdaADq.slurm\n)
2015:02:26 14:09:39 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 79] (/tmp/tmp_hdaADq.slurm 0% 0 0.0KB/s --:-- ETA)
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 81] (/tmp/tmp_hdaADq.slurm 100% 769 0.8KB/s 00:01 \n)
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 6] (sftp> )
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] copy done: ['mput', 'Uploading', '/tmp/tmp_hdaADq.slurm', 'sftp>']
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:40 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: cat 'tmp_hdaADq.slurm' | sbatch && rm -vf 'tmp_hdaADq.slurm'
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 61] (cat 'tmp_hdaADq.slurm' | sbatch && rm -vf 'tmp_hdaADq.slurm'\n)
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 203] (-----------------------------------------------------------------\n Welcome to the Stampede Supercomputer \n-----------------------------------------------------------------\n\n)
2015:02:26 14:09:40 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 81] (--> Verifying valid submit host (login1)...OK\n--> Verifying valid jobname...OK\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 38] (--> Enforcing max jobs per user...OK\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 70] (--> Verifying availability of your home dir (/home1/02998/ardi)...OK\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 69] (--> Verifying availability of your work dir (/work/02998/ardi)...OK\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 75] (--> Verifying availability of your scratch dir (/scratch/02998/ardi)...OK\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 202] (--> Verifying valid ssh keys...OK\n--> Verifying access to desired queue (normal)...OK\n--> Verifying job request is within current queue limits...OK\n--> Checking available allocation (TG-MCB090174)...)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 4] (OK\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 29] (Submitted batch job 4909606\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 38] (removed `tmp_hdaADq.slurm'\nPROMPT-0->)
2015:02:26 14:09:45 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] staged/submit SLURM script (/tmp/tmp_hdaADq.slurm) (tmp_hdaADq.slurm) (0)
2015:02:26 14:09:45 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] started job [slurm+ssh://stampede.tacc.utexas.edu/]-[4909606]
2015:02:26 14:09:45 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Batch system output:
-----------------------------------------------------------------
Welcome to the Stampede Supercomputer
-----------------------------------------------------------------
--> Verifying valid submit host (login1)...OK
--> Verifying valid jobname...OK
--> Enforcing max jobs per user...OK
--> Verifying availability of your home dir (/home1/02998/ardi)...OK
--> Verifying availability of your work dir (/work/02998/ardi)...OK
--> Verifying availability of your scratch dir (/scratch/02998/ardi)...OK
--> Verifying valid ssh keys...OK
--> Verifying access to desired queue (normal)...OK
--> Verifying job request is within current queue limits...OK
--> Checking available allocation (TG-MCB090174)...OK
Submitted batch job 4909606
removed `tmp_hdaADq.slurm'
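Note the submission mechanism: the generated script never needs to persist on the remote side; it is piped straight into sbatch (cat 'tmp_hdaADq.slurm' | sbatch) and deleted immediately afterwards. A minimal sketch of the same stdin submission, run on the login node, with the job id parsed out of the reply:

import re, subprocess

with open('tmp_hdaADq.slurm', 'rb') as f:           # the staged script from the log
    out = subprocess.check_output(['sbatch'], stdin=f)
m = re.search(r'Submitted batch job (\d+)', out.decode())
print(m.group(1))                                    # '4909606' in the run above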
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:45 24164 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: scontrol show job 4909606
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 26] (scontrol show job 4909606\n)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 890] (JobId=4909606 Name=SAGAPythonS ... 8cdba5e6481a57e/\n\nPROMPT-0->)
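scontrol prints the job record as a run of Key=Value tokens, which is what the SLURM adaptor polls to track job state. A rough parser (slurm_job_info is a hypothetical helper; values containing spaces, e.g. Command=..., would need smarter splitting than this):

import subprocess

def slurm_job_info(job_id):
    out = subprocess.check_output(['scontrol', 'show', 'job', str(job_id)]).decode()
    return dict(tok.split('=', 1) for tok in out.split() if '=' in tok)

info = slurm_job_info(4909606)
print(info.get('JobState'), info.get('WorkDir'))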
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] SAGA job submitted with job id [slurm+ssh://stampede.tacc.utexas.edu/]-[4909606]
2015:02:26 14:09:45 radical.pilot.MainProcess: [ERROR ] launcher PilotLauncherWorker-1 stopped
2015:02:26 14:09:45 radical.pilot.MainProcess: [ERROR ] pworker Thread-1 stopped launcher PilotLauncherWorker-1
2015:02:26 14:09:45 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d canceled worker Thread-1
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] Reconnected to existing ComputePilot {'uid': '54ef2905f8cdba5e6481a57e', 'stdout': None, 'start_time': None, 'resource_detail': {'cores_per_node': None, 'nodes': None}, 'submission_time': datetime.datetime(2015, 2, 26, 14, 9, 18, 104000), 'logfile': None, 'resource': u'stampede.tacc.utexas.edu', 'log': [], 'sandbox': u'sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/', 'state': u'Launching', 'stop_time': None, 'stderr': None}
2015:02:26 14:09:45 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d cancels pilot 54ef2905f8cdba5e6481a57e
2015:02:26 14:09:45 radical.pilot.MainProcess: [INFO ] ComputePilot '54ef2905f8cdba5e6481a57e' state changed from 'Launching' to 'PendingActive'.
2015:02:26 14:09:45 radical.pilot.MainProcess: [INFO ] Sent 'COMMAND_CANCEL_PILOT' command to pilots ['54ef2905f8cdba5e6481a57e'].
2015:02:26 14:09:45 radical.pilot.MainProcess: [WARNING ] actively cancel pilot 54ef2905f8cdba5e6481a57e? state: PendingActive
2015:02:26 14:09:45 radical.pilot.MainProcess: [INFO ] actively cancel pilot 54ef2905f8cdba5e6481a57e ([slurm+ssh://stampede.tacc.utexas.edu/]-[4909606], slurm+ssh://stampede.tacc.utexas.edu/)
2015:02:26 14:09:45 radical.pilot.MainProcess: [DEBUG ] flush: [ 16] [ ] (flush pty read cache)
2015:02:26 14:09:46 24164 MainThread saga.SLURMJobService : [DEBUG ] run_sync: scancel 4909606
2015:02:26 14:09:46 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 16] (scancel 4909606\n)
2015:02:26 14:09:46 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 10] (PROMPT-0->)
2015:02:26 14:09:46 radical.pilot.MainProcess: [DEBUG ] Reconnected to existing ComputePilot {'uid': '54ef2905f8cdba5e6481a57e', 'stdout': None, 'start_time': None, 'resource_detail': {'cores_per_node': None, 'nodes': None}, 'submission_time': datetime.datetime(2015, 2, 26, 14, 9, 18, 104000), 'logfile': None, 'resource': u'stampede.tacc.utexas.edu', 'log': [<radical.pilot.logentry.Logentry object at 0x7f9ee031d810>, <radical.pilot.logentry.Logentry object at 0x7f9ee031dad0>, <radical.pilot.logentry.Logentry object at 0x7f9ee031da50>, <radical.pilot.logentry.Logentry object at 0x7f9ee031db10>, <radical.pilot.logentry.Logentry object at 0x7f9ee031da90>, <radical.pilot.logentry.Logentry object at 0x7f9ee031dc50>], 'sandbox': u'sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/', 'state': 'Canceling', 'stop_time': None, 'stderr': None}
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d wait for pilot 54ef2905f8cdba5e6481a57e (Canceling)
2015:02:26 14:09:46 radical.pilot.MainProcess: [DEBUG ] Reconnected to existing ComputePilot {'uid': '54ef2905f8cdba5e6481a57e', 'stdout': None, 'start_time': None, 'resource_detail': {'cores_per_node': None, 'nodes': None}, 'submission_time': datetime.datetime(2015, 2, 26, 14, 9, 18, 104000), 'logfile': None, 'resource': u'stampede.tacc.utexas.edu', 'log': [<radical.pilot.logentry.Logentry object at 0x7f9ee031dad0>, <radical.pilot.logentry.Logentry object at 0x7f9ee031da50>, <radical.pilot.logentry.Logentry object at 0x7f9ee031db10>, <radical.pilot.logentry.Logentry object at 0x7f9ee031da90>, <radical.pilot.logentry.Logentry object at 0x7f9ee031dc50>, <radical.pilot.logentry.Logentry object at 0x7f9ee031dc10>], 'sandbox': u'sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-54ef2905f8cdba5e6481a57e/', 'state': 'Canceling', 'stop_time': None, 'stderr': None}
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d canceled pilot 54ef2905f8cdba5e6481a57e
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d stops worker Thread-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pworker Thread-1 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pworker Thread-1 stops launcher PilotLauncherWorker-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] launcher PilotLauncherWorker-1 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] launcher PilotLauncherWorker-1 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pworker Thread-1 stopped launcher PilotLauncherWorker-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pworker Thread-1 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d stopped worker Thread-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] pmgr 54ef2905f8cdba5e6481a57d closed
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] session 54ef2905f8cdba5e6481a57c closed pmgr None
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] session 54ef2905f8cdba5e6481a57c closes umgr 54ef290ef8cdba5e6481a57f
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stops itransfer InputFileTransferWorker-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] itransfer InputFileTransferWorker-1 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] itransfer InputFileTransferWorker-1 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stopped itransfer InputFileTransferWorker-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stops itransfer InputFileTransferWorker-2
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] itransfer InputFileTransferWorker-2 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] itransfer InputFileTransferWorker-2 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stopped itransfer InputFileTransferWorker-2
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stops otransfer OutputFileTransferWorker-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] otransfer OutputFileTransferWorker-1 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] otransfer OutputFileTransferWorker-1 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stopped otransfer OutputFileTransferWorker-1
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stops otransfer OutputFileTransferWorker-2
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] otransfer OutputFileTransferWorker-2 stopping
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] otransfer OutputFileTransferWorker-2 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stopped otransfer OutputFileTransferWorker-2
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] uworker Thread-3 stopped
2015:02:26 14:09:46 radical.pilot.MainProcess: [INFO ] Closed UnitManager 54ef290ef8cdba5e6481a57f.
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] session 54ef2905f8cdba5e6481a57c closed umgr None
2015:02:26 14:09:46 radical.pilot.MainProcess: [INFO ] Deleted session 54ef2905f8cdba5e6481a57c from database.
2015:02:26 14:09:46 radical.pilot.MainProcess: [ERROR ] session None closed
ExTASY version : 0.1.3-beta-1-g2cdf59c
Session UID: 54ef2905f8cdba5e6481a57c
Pilot UID : 54ef2905f8cdba5e6481a57e
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/gromacs.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/amber.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/mmpbsa.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/test.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/coco.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/lsdmap.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/sleep.json
Loading kernel configurations from /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/ensemblemd/mdkernels/configs/namd.json
Preprocessing stage ....
Expecting 0 ncdf files in backup/iter-1 folder
An error occurred: [Errno 2] No such file or directory: 'backup/iter-1'
Closing session, exiting now ...
[Callback]: ComputePilot '54ef2905f8cdba5e6481a57e' state changed to Launching.
[Callback]: ComputePilot '54ef2905f8cdba5e6481a57e' state changed to PendingActive.
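The actual failure is local and happens before the pilot ever becomes active: the CoCo/Amber preprocessing stage tries to list 'backup/iter-1' relative to the working directory, the folder does not exist on a fresh run, and the session is torn down, which is why the freshly submitted SLURM job 4909606 is immediately scancel'ed above. A defensive sketch, assuming the stage only needs the folder to be present, as the "Expecting 0 ncdf files" line suggests:

import glob, os

backup = os.path.join('backup', 'iter-1')
if not os.path.isdir(backup):        # Python 2.7-safe; makedirs has no exist_ok here
    os.makedirs(backup)
print(len(glob.glob(os.path.join(backup, '*.ncdf'))), 'ncdf files found')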