@ashkurti
Created November 5, 2014 11:46
Output of the bag-of-tasks run on Stampede with the debug option enabled - November 5
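For reference, the transcript below was produced by a script along these lines. This is a minimal sketch against the RADICAL-Pilot 0.21 API of the time: the resource key, project, runtime, core count, scheduler, unit count, and status messages are taken from the log itself, but the actual simple_bot.py is not included in this gist, so the task executable (/bin/date) and the callback names are illustrative assumptions.

    import os
    import radical.pilot

    # MongoDB coordination database, taken from the environment
    # (set via 'setenv RADICAL_PILOT_DBURL ...' in the transcript below)
    DBURL = os.getenv("RADICAL_PILOT_DBURL")

    def pilot_state_cb(pilot, state):
        # produces the "[Callback]: ComputePilot ... state: ..." lines in the log
        print "[Callback]: ComputePilot '%s' state: %s." % (pilot.uid, state)

    def unit_state_cb(unit, state):
        # produces the "[Callback]: ComputeUnit ... state: ..." lines in the log
        print "[Callback]: ComputeUnit '%s' state: %s." % (unit.uid, state)

    session = radical.pilot.Session(database_url=DBURL)

    print "Initializing Pilot Manager ..."
    pmgr = radical.pilot.PilotManager(session=session)
    pmgr.register_callback(pilot_state_cb)

    # pilot description matching the values logged at launch time
    pdesc = radical.pilot.ComputePilotDescription()
    pdesc.resource = "stampede.tacc.utexas.edu"  # deprecated alias of 'xsede.stampede'
    pdesc.project  = "TG-MCB090174"
    pdesc.runtime  = 10   # minutes
    pdesc.cores    = 1
    pdesc.cleanup  = True

    print "Submitting Compute Pilot to Pilot Manager ..."
    pilot = pmgr.submit_pilots(pdesc)

    print "Initializing Unit Manager ..."
    umgr = radical.pilot.UnitManager(
        session=session,
        scheduler=radical.pilot.SCHED_DIRECT_SUBMISSION)
    umgr.register_callback(unit_state_cb)

    print "Registering Compute Pilot with Unit Manager ..."
    umgr.add_pilots(pilot)

    print "Submit Compute Units to Unit Manager ..."
    cuds = []
    for i in range(10):   # the log schedules 10 ComputeUnits
        cud = radical.pilot.ComputeUnitDescription()
        cud.executable = "/bin/date"   # placeholder task; actual executable unknown
        cud.cores      = 1
        cuds.append(cud)
    units = umgr.submit_units(cuds)

    print "Waiting for CUs to complete ..."
    umgr.wait_units()
    session.close()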
[ExTASY-tools] ardita@tirith 132% setenv RADICAL_PILOT_DBURL 'mongodb://ec2-184-72-89-141.compute-1.amazonaws.com:27017/'
[ExTASY-tools] ardita@tirith 133% python simple_bot.py
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] radical.pilot version: 0.21 (v0.21)
2014:11:05 11:43:57 21037 MainThread saga : [INFO ] saga-python version: 0.22
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.myproxy
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.myproxy for saga.Context API with URL scheme(s) ['myproxy://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.x509
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.x509 for saga.Context API with URL scheme(s) ['x509://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.ssh
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.ssh for saga.Context API with URL scheme(s) ['ssh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.context.userpass
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.context.userpass for saga.Context API with URL scheme(s) ['userpass://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.shell.shell_job
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_job for saga.job.Service API with URL scheme(s) ['fork://', 'local://', 'ssh://', 'gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_job for saga.job.Job API with URL scheme(s) ['fork://', 'local://', 'ssh://', 'gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.shell.shell_file
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.namespace.Directory API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.namespace.Entry API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.filesystem.Directory API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.shell.shell_file for saga.filesystem.File API with URL scheme(s) ['file://', 'local://', 'sftp://', 'gsisftp://', 'ssh://', 'gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.shell.shell_resource
2014:11:05 11:43:57 21037 MainThread saga.Engine : [WARNING ] Skipping adaptor saga.adaptors.shell.shell_resource: beta versions are disabled (v0.1.beta)
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.redis.redis_advert
2014:11:05 11:43:57 21037 MainThread saga.Engine : [WARNING ] Skipping adaptor saga.adaptors.redis.redis_advert 1: module loading failed: No module named redis
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.sge.sgejob
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.sge.sgejob for saga.job.Service API with URL scheme(s) ['sge://', 'sge+ssh://', 'sge+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.sge.sgejob for saga.job.Job API with URL scheme(s) ['sge://', 'sge+ssh://', 'sge+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.pbs.pbsjob
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.pbs.pbsjob for saga.job.Service API with URL scheme(s) ['pbs://', 'pbs+ssh://', 'pbs+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.pbs.pbsjob for saga.job.Job API with URL scheme(s) ['pbs://', 'pbs+ssh://', 'pbs+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.lsf.lsfjob
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.lsf.lsfjob for saga.job.Service API with URL scheme(s) ['lsf://', 'lsf+ssh://', 'lsf+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.lsf.lsfjob for saga.job.Job API with URL scheme(s) ['lsf://', 'lsf+ssh://', 'lsf+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.condor.condorjob
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.condor.condorjob for saga.job.Service API with URL scheme(s) ['condor://', 'condor+ssh://', 'condor+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.condor.condorjob for saga.job.Job API with URL scheme(s) ['condor://', 'condor+ssh://', 'condor+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.slurm.slurm_job
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.slurm.slurm_job for saga.job.Service API with URL scheme(s) ['slurm://', 'slurm+ssh://', 'slurm+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.slurm.slurm_job for saga.job.Job API with URL scheme(s) ['slurm://', 'slurm+ssh://', 'slurm+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.http.http_file
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.http.http_file for saga.namespace.Entry API with URL scheme(s) ['http://', 'https://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.http.http_file for saga.filesystem.File API with URL scheme(s) ['http://', 'https://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.aws.ec2_resource
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.aws.ec2_resource for saga.Context API with URL scheme(s) ['ec2://', 'ec2_keypair://', 'openstack://', 'eucalyptus://', 'euca://', 'aws://', 'amazon://', 'http://', 'https://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.aws.ec2_resource for saga.resource.Manager API with URL scheme(s) ['ec2://', 'ec2_keypair://', 'openstack://', 'eucalyptus://', 'euca://', 'aws://', 'amazon://', 'http://', 'https://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.aws.ec2_resource for saga.resource.Compute API with URL scheme(s) ['ec2://', 'ec2_keypair://', 'openstack://', 'eucalyptus://', 'euca://', 'aws://', 'amazon://', 'http://', 'https://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.loadl.loadljob
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.loadl.loadljob for saga.job.Service API with URL scheme(s) ['loadl://', 'loadl+ssh://', 'loadl+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.loadl.loadljob for saga.job.Job API with URL scheme(s) ['loadl://', 'loadl+ssh://', 'loadl+gsissh://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Loading adaptor saga.adaptors.globus_online.go_file
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.namespace.Directory API with URL scheme(s) ['go+gsisftp://', 'go+gridftp://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.namespace.Entry API with URL scheme(s) ['go+gsisftp://', 'go+gridftp://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.filesystem.Directory API with URL scheme(s) ['go+gsisftp://', 'go+gridftp://']
2014:11:05 11:43:57 21037 MainThread saga.Engine : [INFO ] Register adaptor saga.adaptors.globus_online.go_file for saga.filesystem.File API with URL scheme(s) ['go+gsisftp://', 'go+gridftp://']
2014:11:05 11:43:57 21037 MainThread radical.utils : [DEBUG ] lm new manager
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/known_hosts.old (no public key: /users/ardita/.ssh/known_hosts.old.pub)
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/k.dat (no public key: /users/ardita/.ssh/k.dat.pub)
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] default ssh key at /users/ardita/.ssh/id_rsa
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/environment (no public key: /users/ardita/.ssh/environment.pub)
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/known_hosts (no public key: /users/ardita/.ssh/known_hosts.pub)
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/authorized_keys (no public key: /users/ardita/.ssh/authorized_keys.pub)
2014:11:05 11:43:57 21037 MainThread saga.saga.adaptor.ssh : [INFO ] ignore ssh key at /users/ardita/.ssh/config (no public key: /users/ardita/.ssh/config.pub)
2014:11:05 11:43:57 21037 MainThread saga.ContextSSH : [INFO ] init SSH context for key at '/users/ardita/.ssh/id_rsa' done
2014:11:05 11:43:57 21037 MainThread saga.DefaultSession : [DEBUG ] default context [saga.adaptor.ssh ] : {'LifeTime' : '-1', 'Type' : 'ssh', 'UserCert' : '/users/ardita/.ssh/id_rsa.pub', 'UserKey' : '/users/ardita/.ssh/id_rsa'}
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] using database url mongodb://ec2-184-72-89-141.compute-1.amazonaws.com:27017/
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] using database name radicalpilot
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.sierra
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.alamo
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.hotel
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for futuregrid.india
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for lrz.supermuc
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for epsrc.archer
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for rice.davinci
2014:11:05 11:43:57 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for local.localhost
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for radical.tutorial
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for das4.fs2
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for ncar.yellowstone
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for iu.bigred2
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for iu.quarry
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.stampede
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.trestles
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.gordon
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.blacklight
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] Loaded resource configurations for xsede.lonestar
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] New Session created{'uid': '545a0d7e54737d522db8e082', 'created': datetime.datetime(2014, 11, 5, 11, 43, 58, 123539), 'database_auth': ':', 'database_name': 'radicalpilot', 'database_url': 'mongodb://ec2-184-72-89-141.compute-1.amazonaws.com:27017/radicalpilot', 'last_reconnect': None}.
Initializing Pilot Manager ...
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] Worker thread (ID: Thread-1[140095524960000]) for PilotManager 545a0d7e54737d522db8e083 started.
Submitting Compute Pilot to Pilot Manager ...
2014:11:05 11:43:58 radical.pilot.MainProcess: [WARNING ] using alias 'xsede.stampede' for deprecated resource key 'stampede.tacc.utexas.edu'
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] saga.utils.PTYShell ('sftp://stampede.tacc.utexas.edu/')
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f6a8db73510>
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] open master pty for [ssh] [ardi@stampede.tacc.utexas.edu] ardi: /usr/bin/env TERM=vt100 "/usr/bin/ssh" -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu'
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8e7d7e10>
2014:11:05 11:43:58 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:43:58 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for PilotManager 545a0d7e54737d522db8e083.
2014:11:05 11:43:59 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2014:11:05 11:43:59 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 1\n)
2014:11:05 11:44:00 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2014:11:05 11:44:00 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 2\n)
2014:11:05 11:44:01 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 51] ( export PS1='$' > /dev/null 2>&1 || set prompt='$'\n)
2014:11:05 11:44:01 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 3\n)
2014:11:05 11:44:01 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 2396] (Last login: Wed Nov 5 05:24:0 ... __________________________\n\n)
2014:11:05 11:44:01 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Wed Nov 5 05:24:01 2014 from tirith.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2014:11:05 11:44:01 radical.pilot.MainProcess: [DEBUG ] waiting for prompt trigger HELLO_3_SAGA: (5) (Last login: Wed Nov 5 05:24:01 2014 from tirith.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 79220 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] write: [ 5] [ 28] ( printf 'HELLO_%d_SAGA\n' 3\n)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.4 5.0 8.05 16480 150000 10.99 |\n| /work 0.0 1024.0 0.00 2683 3000000 0.09 |\n-------------------------------------------------------------------------------\n)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 137] (\nTip 53 (See "module help tacc_tips" for features or how to disable)\n\n Did you know that emacs and vim support spell checking?\n\n)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] read : [ 5] [ 69] (login2.stampede(1)$ $$HELLO_1_SAGA\n$$HELLO_2_SAGA\n$$HELLO_3_SAGA\n$)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] got shell prompt trigger (4) (
See "man slurm" or the Stampede user guide for more detailed information.
--> To see all the software that is available across all compilers and
mpi stacks, issue: "module spider"
--> To see which software packages are available with your currently loaded
compiler and mpi stack, issue: "module avail"
--> Stampede has three parallel file systems: $HOME (permanent,
quota'd, backed-up) $WORK (permanent, quota'd, not backed-up) and
$SCRATCH (high-speed purged storage). The "cdw" and "cds" aliases
are provided as a convenience to change to your $WORK and $SCRATCH
directories, respectively.
______________________________________________________________________________
----------------------- Project balances for user ardi ------------------------
| Name Avail SUs Expires | Name Avail SUs Expires |
| TG-MCB090174 79220 2015-09-30 | TG-TRA140016 -81080 2015-05-06 |
-------------------------- Disk quotas for user ardi --------------------------
| Disk Usage (GB) Limit %Used File Usage Limit %Used |
| /home1 0.4 5.0 8.05 16480 150000 10.99 |
| /work 0.0 1024.0 0.00 2683 3000000 0.09 |
-------------------------------------------------------------------------------
Tip 53 (See "module help tacc_tips" for features or how to disable)
Did you know that emacs and vim support spell checking?
login2.stampede(1)$ $$HELLO_1_SAGA
$$HELLO_2_SAGA
$$HELLO_3_SAGA)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (
$)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (
$)
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8db73950>
2014:11:05 11:44:02 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2014:11:05 11:44:02 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:03 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 2396] (Last login: Wed Nov 5 05:44:0 ... __________________________\n\n)
2014:11:05 11:44:03 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Wed Nov 5 05:44:01 2014 from tirith.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2014:11:05 11:44:03 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (Last login: Wed Nov 5 05:44:01 2014 from tirith.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2014:11:05 11:44:03 radical.pilot.MainProcess: [DEBUG ] running command shell: exec /bin/sh -i
2014:11:05 11:44:03 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 79220 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.4 5.0 8.05 16480 150000 10.99 |\n| /work 0.0 1024.0 0.00 2683 3000000 0.09 |\n-------------------------------------------------------------------------------\n)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 237] (\nTip 76 (See "module help tacc_tips" for features or how to disable)\n\n Want to mkdir and cd with a single command? You can use the following function in bash or zsh:\n function mkdircd () { mkdir -p "$1" && eval cd "$1"; }\n\n)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 21] (login2.stampede(1)$ $)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 1] ($)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 10] (PROMPT-0->)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] got new shell prompt
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] run_sync: echo "WORKDIR: $WORK"
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] write: [ 6] [ 22] (echo "WORKDIR: $WORK"\n)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] read : [ 6] [ 37] (WORKDIR: /work/02998/ardi\nPROMPT-0->)
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] Determined remote working directory for sftp://stampede.tacc.utexas.edu/: '/work/02998/ardi'
2014:11:05 11:44:04 radical.pilot.MainProcess: [DEBUG ] PTYShell del <saga.utils.pty_shell.PTYShell object at 0x7f6a8db73510>
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] PTYProcess del <saga.utils.pty_process.PTYProcess object at 0x7f6a8db73950>
Initializing Unit Manager ...
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] Launching ComputePilot {u'state': u'PendingLaunch', u'commands': [], u'description': {u'project': u'TG-MCB090174', u'resource': u'stampede.tacc.utexas.edu', u'queue': None, u'sandbox': None, u'cleanup': True, u'pilot_agent_priv': None, u'access_schema': None, u'memory': None, u'cores': 1, u'runtime': 10}, u'sagajobid': None, u'started': None, u'cores_per_node': None, u'output_transfer_started': None, u'finished': None, u'submitted': datetime.datetime(2014, 11, 5, 11, 44, 4, 845000), u'output_transfer_finished': None, u'sandbox': u'sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/', u'pilotmanager': u'545a0d7e54737d522db8e083', u'unitmanager': None, u'heartbeat': None, u'statehistory': [{u'timestamp': datetime.datetime(2014, 11, 5, 11, 44, 4, 844000), u'state': u'PendingLaunch'}], u'input_transfer_started': None, u'_id': ObjectId('545a0d7e54737d522db8e084'), u'input_transfer_finished': None, u'nodes': None, u'log': []}
2014:11:05 11:44:05 radical.pilot.MainProcess: [WARNING ] using alias 'xsede.stampede' for deprecated resource key 'stampede.tacc.utexas.edu'
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] Using pilot agent /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] Using bootstrapper /users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] Copying bootstrapper 'file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh' to agent sandbox (sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084//default_bootstrapper.sh).
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] init_instance file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool ([])
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm create pool for file://localhost/shell_file_adaptor_command_shell (<type 'str'>) (<radical.utils.lease_manager.LeaseManager object at 0x7f6a8e78be90>)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for file://localhost/shell_file_adaptor_command_shell (0)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] []
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm create object for file://localhost/shell_file_adaptor_command_shell
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f6a8db2be10>
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] open master pty for [sh] [localhost] ardita: /usr/bin/env TERM=vt100 "/bin/tcsh" -i'
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8e7d7ed0>
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] running: /usr/bin/env TERM=vt100 /bin/tcsh -i
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 6] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] Starting InputFileTransferWorker
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] Starting InputFileTransferWorker
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] Worker thread (ID: Thread-3[140095483000576]) for UnitManager 545a0d8554737d522db8e085 started.
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] Loaded scheduler: DirectSubmissionScheduler.
Registering Compute Pilot with Unit Manager ...
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 74] (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'\n)
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 545a0d8554737d522db8e085.
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 545a0d8554737d522db8e085.
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 545a0d8554737d522db8e085.
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 3] ( \n)
2014:11:05 11:44:05 radical.pilot.MainProcess: [DEBUG ] Connected to MongoDB. Serving requests for UnitManager 545a0d8554737d522db8e085.
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 45] (Welcome to tirith.pharm.nottingham.ac.uk\n \n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 46] (Operating System: CentOS release 6.6 (Final)\n)
2014:11:05 11:44:05 radical.pilot.MainProcess: [INFO ] ComputePilot '545a0d7e54737d522db8e084' state changed from 'PendingLaunch' to 'Launching'.
[Callback]: ComputePilot '545a0d7e54737d522db8e084' state: Launching.
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 94] (CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275\n 4 2193 MHz x86_64 Processors\n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 62] (Main memory size: 3831 Mbytes\nSwap memory size: 3967 Mbytes\n)
Submit Compute Units to Unit Manager ...
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 41] (Removable media:\n /dev/sr0: CD-224E-N\n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 3] ( \n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 316] (Disk quotas for user ardita (uid 1285): \n Filesystem blocks quota limit grace files quota limit grace\n192.168.1.253:/users/\n 279G 300G 1109G 106k 0 0 \n \nCurrently Loaded Modulefiles:\n 1) sysinit 2) intel/14.0.3 3) python/2.7.8\n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 3] ( \n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 41] (Condor queue: 0 jobs currently in queue\n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 131] ( \nFor information about this machine and software available on it go to:\n http://holmes.cancres.nottingham.ac.uk/facilities/\n \n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 413] (*********************************************************************\n\n DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE \n\n Please either use the Condor Pool, or run interactively on the \n fastest machine that does not appear to currently be in use; \n darling.pharm.nottingham.ac.uk\n\n*********************************************************************\n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 81] (ardita@tirith 101% export PS1='$' ; set prompt='$'\nexport: Command not found.\n)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 6] [ 1] ($)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got initial shell prompt (5) (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'
Welcome to tirith.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275
4 2193 MHz x86_64 Processors
Main memory size: 3831 Mbytes
Swap memory size: 3967 Mbytes
Removable media:
/dev/sr0: CD-224E-N
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
192.168.1.253:/users/
279G 300G 1109G 106k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
*********************************************************************
DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE
Please either use the Condor Pool, or run interactively on the
fastest machine that does not appear to currently be in use;
darling.pharm.nottingham.ac.uk
*********************************************************************
ardita@tirith 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] Got initial shell prompt (5) (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'
Welcome to tirith.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275
4 2193 MHz x86_64 Processors
Main memory size: 3831 Mbytes
Swap memory size: 3967 Mbytes
Removable media:
/dev/sr0: CD-224E-N
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
192.168.1.253:/users/
279G 300G 1109G 106k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
*********************************************************************
DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE
Please either use the Condor Pool, or run interactively on the
fastest machine that does not appear to currently be in use;
darling.pharm.nottingham.ac.uk
*********************************************************************
ardita@tirith 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8c459310>
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] running: /usr/bin/env TERM=vt100 /bin/tcsh -i
2014:11:05 11:44:05 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 13] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 74] (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 3] ( \n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 45] (Welcome to tirith.pharm.nottingham.ac.uk\n \n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 46] (Operating System: CentOS release 6.6 (Final)\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 94] (CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275\n 4 2193 MHz x86_64 Processors\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 62] (Main memory size: 3831 Mbytes\nSwap memory size: 3967 Mbytes\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 18] (Removable media:\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 23] ( /dev/sr0: CD-224E-N\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 3] ( \n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 230] (Disk quotas for user ardita (uid 1285): \n Filesystem blocks quota limit grace files quota limit grace\n192.168.1.253:/users/\n 279G 300G 1109G 106k 0 0 \n \n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 31] (Currently Loaded Modulefiles:\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 55] ( 1) sysinit 2) intel/14.0.3 3) python/2.7.8\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 3] ( \n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 41] (Condor queue: 0 jobs currently in queue\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 131] ( \nFor information about this machine and software available on it go to:\n http://holmes.cancres.nottingham.ac.uk/facilities/\n \n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 413] (*********************************************************************\n\n DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE \n\n Please either use the Condor Pool, or run interactively on the \n fastest machine that does not appear to currently be in use; \n darling.pharm.nottingham.ac.uk\n\n*********************************************************************\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 53] (ardita@tirith 101% export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 29] (export: Command not found.\n$)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got initial shell prompt (5) (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'
Welcome to tirith.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275
4 2193 MHz x86_64 Processors
Main memory size: 3831 Mbytes
Swap memory size: 3967 Mbytes
Removable media:
/dev/sr0: CD-224E-N
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
192.168.1.253:/users/
279G 300G 1109G 106k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
*********************************************************************
DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE
Please either use the Condor Pool, or run interactively on the
fastest machine that does not appear to currently be in use;
darling.pharm.nottingham.ac.uk
*********************************************************************
ardita@tirith 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] Got initial shell prompt (5) (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'
Welcome to tirith.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275
4 2193 MHz x86_64 Processors
Main memory size: 3831 Mbytes
Swap memory size: 3967 Mbytes
Removable media:
/dev/sr0: CD-224E-N
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
192.168.1.253:/users/
279G 300G 1109G 106k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
*********************************************************************
DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE
Please either use the Condor Pool, or run interactively on the
fastest machine that does not appear to currently be in use;
darling.pharm.nottingham.ac.uk
*********************************************************************
ardita@tirith 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] running command shell: exec /bin/sh -i
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 13] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 48] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 8] (sh-4.1$ )
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 13] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 10] (PROMPT-0->)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got new shell prompt
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: cd /users/ardita
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 13] [ 17] (cd /users/ardita\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 10] (PROMPT-0->)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0000 was unused for 0.0s
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8db2bdd0> use: True -- new!
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh'
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 13] [ 123] (true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh'\n)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 10] (PROMPT-0->)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] file initialized (0)()
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool (['file://localhost/shell_file_adaptor_command_shell'])
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm create pool for fork://localhost/shell_file_adaptor_command_shell (<type 'str'>) (<radical.utils.lease_manager.LeaseManager object at 0x7f6a8e78be90>)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for fork://localhost/shell_file_adaptor_command_shell (0)
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] []
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm create object for fork://localhost/shell_file_adaptor_command_shell
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f6a8c459610>
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8e78b710>
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] running: /usr/bin/env TERM=vt100 /bin/tcsh -i
2014:11:05 11:44:06 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 14] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 74] (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 3] ( \n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 45] (Welcome to tirith.pharm.nottingham.ac.uk\n \n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 46] (Operating System: CentOS release 6.6 (Final)\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 94] (CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275\n 4 2193 MHz x86_64 Processors\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 62] (Main memory size: 3831 Mbytes\nSwap memory size: 3967 Mbytes\n)
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e08a' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e08a' state: PendingExecution.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e08e' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e08e' state: PendingExecution.
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 18] (Removable media:\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 23] ( /dev/sr0: CD-224E-N\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 3] ( \n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 230] (Disk quotas for user ardita (uid 1285): \n Filesystem blocks quota limit grace files quota limit grace\n192.168.1.253:/users/\n 279G 300G 1109G 106k 0 0 \n \n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 31] (Currently Loaded Modulefiles:\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 55] ( 1) sysinit 2) intel/14.0.3 3) python/2.7.8\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 3] ( \n)
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e08b' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e08b' state: PendingExecution.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e086' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e086' state: PendingExecution.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e08f' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e08f' state: PendingExecution.
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 41] (Condor queue: 0 jobs currently in queue\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 131] ( \nFor information about this machine and software available on it go to:\n http://holmes.cancres.nottingham.ac.uk/facilities/\n \n)
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e08c' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e08c' state: PendingExecution.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e089' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e089' state: PendingExecution.
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 413] (*********************************************************************\n\n DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE \n\n Please either use the Condor Pool, or run interactively on the \n fastest machine that does not appear to currently be in use; \n darling.pharm.nottingham.ac.uk\n\n*********************************************************************\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 53] (ardita@tirith 101% export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 29] (export: Command not found.\n$)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got initial shell prompt (5) (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'
Welcome to tirith.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275
4 2193 MHz x86_64 Processors
Main memory size: 3831 Mbytes
Swap memory size: 3967 Mbytes
Removable media:
/dev/sr0: CD-224E-N
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
192.168.1.253:/users/
279G 300G 1109G 106k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
*********************************************************************
DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE
Please either use the Condor Pool, or run interactively on the
fastest machine that does not appear to currently be in use;
darling.pharm.nottingham.ac.uk
*********************************************************************
ardita@tirith 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] Got initial shell prompt (5) (ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for 'mpi'
Welcome to tirith.pharm.nottingham.ac.uk
Operating System: CentOS release 6.6 (Final)
CPU: AuthenticAMD Dual Core AMD Opteron(tm) Processor 275
4 2193 MHz x86_64 Processors
Main memory size: 3831 Mbytes
Swap memory size: 3967 Mbytes
Removable media:
/dev/sr0: CD-224E-N
Disk quotas for user ardita (uid 1285):
Filesystem blocks quota limit grace files quota limit grace
192.168.1.253:/users/
279G 300G 1109G 106k 0 0
Currently Loaded Modulefiles:
1) sysinit 2) intel/14.0.3 3) python/2.7.8
Condor queue: 0 jobs currently in queue
For information about this machine and software available on it go to:
http://holmes.cancres.nottingham.ac.uk/facilities/
*********************************************************************
DO NOT RUN COMPUTATIONALLY INTENSIVE TASKS ON THIS MACHINE
Please either use the Condor Pool, or run interactively on the
fastest machine that does not appear to currently be in use;
darling.pharm.nottingham.ac.uk
*********************************************************************
ardita@tirith 101% export PS1='$' ; set prompt='$'
export: Command not found.
$)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] running command shell: exec /bin/sh -i
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 14] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 48] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 8] (sh-4.1$ )
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 14] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 10] (PROMPT-0->)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got new shell prompt
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: cd /users/ardita
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 14] [ 17] (cd /users/ardita\n)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 14] [ 10] (PROMPT-0->)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0001 was unused for 0.0s
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c4595d0> use: True -- new!
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool (['fork://localhost/shell_file_adaptor_command_shell', 'file://localhost/shell_file_adaptor_command_shell'])
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm create pool for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell (<type 'str'>) (<radical.utils.lease_manager.LeaseManager object at 0x7f6a8e78be90>)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell (0)
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] []
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm create object for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f6a8c459c90>
2014:11:05 11:44:07 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
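The logged prompt pattern [\$#%>\]]\s*$ is intentionally loose: any buffer ending in $, #, %, > or ] counts as an initial prompt, which is why the noisy Stampede banner below still ends in a successful prompt detection. For example:

    import re

    # saga's initial prompt pattern, exactly as logged above.
    PROMPT = re.compile(r'[\$#%>\]]\s*$')

    for buf in ('sh-4.1$ ', 'login2.stampede(1)$ $', 'PROMPT-0->',
                'still running...'):
        print(repr(buf), '->', bool(PROMPT.search(buf)))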
2014:11:05 11:44:07 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8e79bfd0>
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2014:11:05 11:44:07 radical.pilot.MainProcess: [DEBUG ] write: [ 15] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] Scheduled ComputeUnits [<radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3ac10>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8c459150>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3ae50>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3a290>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8c4591d0>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3af10>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3ab50>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3aa90>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8db3a9d0>, <radical.pilot.compute_unit.ComputeUnit object at 0x7f6a8c459050>] for execution on ComputePilot '545a0d7e54737d522db8e084'.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] 0 units remain unscheduled
Waiting for CUs to complete ...
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e088' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e088' state: PendingExecution.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e087' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e087' state: PendingExecution.
2014:11:05 11:44:07 radical.pilot.MainProcess: [INFO ] RUN ComputeUnit '545a0d8554737d522db8e08d' state changed from 'New' to 'PendingExecution'.
[Callback]: ComputeUnit '545a0d8554737d522db8e08d' state: PendingExecution.
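The interleaved "[Callback]" lines come from the state callbacks registered in simple_bot.py (its pilot_state_cb shows up in the traceback near the end of this run). A sketch of what such callbacks typically look like in a radical.pilot 0.21 script, assuming the names from that traceback:

    import sys

    def pilot_state_cb(pilot, state):
        print("[Callback]: ComputePilot '%s' state: %s." % (pilot.uid, state))
        if state == 'Failed':
            sys.exit(1)      # this is exactly what aborts the run below

    def unit_state_cb(unit, state):
        print("[Callback]: ComputeUnit '%s' state: %s." % (unit.uid, state))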
2014:11:05 11:44:08 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 2396] (Last login: Wed Nov 5 05:44:0 ... __________________________\n\n)
2014:11:05 11:44:08 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Wed Nov 5 05:44:03 2014 from tirith.pharm.nottingham.ac.uk
------------------------------------------------------------------------------
Welcome to the Stampede Supercomputer
Texas Advanced Computing Center, The University of Texas at Austin
------------------------------------------------------------------------------
** Unauthorized use/access is prohibited. **
If you log on to this computer system, you acknowledge your awareness
of and concurrence with the UT Austin Acceptable Use Policy. The
University will prosecute violators to the full extent of the law.
TACC Usage Policies:
http://www.tacc.utexas.edu/user-services/usage-policies/
______________________________________________________________________________
Questions and Problem Reports:
--> XD Projects: help@xsede.org (email)
--> TACC Projects: portal.tacc.utexas.edu (web)
Documentation: http://www.tacc.utexas.edu/user-services/user-guides/
User News: http://www.tacc.utexas.edu/user-services/user-news/
______________________________________________________________________________
Welcome to Stampede, *please* read these important system notes:
--> Stampede is currently running the SLURM resource manager to
schedule all compute resources. Example SLURM job scripts are
available on the system at /share/doc/slurm
To run an interactive shell, issue:
srun -p development -t 0:30:00 -n 32 --pty /bin/bash -l
To submit a batch job, issue: sbatch job.mpi
To show all queued jobs, issue: showq
To kill a queued job, issue: scancel <jobId>
)
2014:11:05 11:44:08 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (same Stampede banner as above)
2014:11:05 11:44:08 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] running command shell: exec /bin/sh -i
2014:11:05 11:44:08 radical.pilot.MainProcess: [DEBUG ] write: [ 15] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 79220 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.4 5.0 8.05 16480 150000 10.99 |\n| /work 0.0 1024.0 0.00 2683 3000000 0.09 |\n-------------------------------------------------------------------------------\n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 158] (\nTip 92 (See "module help tacc_tips" for features or how to disable)\n\n Bash scripts can to simple arithmetic: i=1; j=2; ((k=j*10+i)); echo $k -> 21\n\n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 21] (login2.stampede(1)$ $)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] write: [ 15] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 1] ($)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 10] (PROMPT-0->)
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] got new shell prompt
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0002 was unused for 0.0s
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> use: True -- new!
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] write: [ 15] [ 80] (mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/\n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 15] [ 10] (PROMPT-0->)
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0002 was leased for 0.2s
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool (['sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell', 'fork://localhost/shell_file_adaptor_command_shell', 'file://localhost/shell_file_adaptor_command_shell'])
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell (1)
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] [<radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50>]
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> use: False
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> use: ok!
2014:11:05 11:44:09 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0002 was unused for 0.0s
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8c40d4d0>
2014:11:05 11:44:09 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/sftp -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:09 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 43] (Connecting to stampede.tacc.utexas.edu...\n)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 40] (sftp> export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 24] (Invalid command.\nsftp> )
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Connecting to stampede.tacc.utexas.edu...
sftp> export PS1='$' ; set prompt='$'
Invalid command.
sftp> )
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (same sftp exchange as above)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 213] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh" "/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/default_bootstrapper.sh" \n\n)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 217] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh" "/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/default_bootstrapper.sh" \n)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 215] (Uploading //users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/bootstrapper/default_bootstrapper.sh to /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/default_bootstrapper.sh\n)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 79] (//users/ardita/ExTASY-tools/lib/python2.7/sit 0% 0 0.0KB/s --:-- ETA)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 81] (//users/ardita/ExTASY-tools/lib/python2.7/sit 100% 14KB 14.3KB/s 00:00 \n)
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 14] (sftp> \nsftp> )
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] copy done: ['sftp>']
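The "Invalid command." earlier is expected: the prompt trigger meant for interactive shells is also written to the sftp channel, where it is rejected but harmless; saga then drives the transfer with mput, as in the exchange above. Roughly the same staging step can be done in sftp batch mode (paths shortened, purely illustrative, and assuming key-based auth):

    import subprocess

    # One-shot transfer using sftp's batch mode ('-b -' reads commands
    # from stdin) instead of saga's interactive PTY session.
    batch = 'put default_bootstrapper.sh /work/02998/ardi/.../\n'
    subprocess.run(['sftp', '-b', '-', 'ardi@stampede.tacc.utexas.edu'],
                   input=batch, text=True, check=True)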
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0002 was leased for 2.3s
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8db2bdd0> for file://localhost/shell_file_adaptor_command_shell
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0000 was leased for 5.2s
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c4595d0> for fork://localhost/shell_file_adaptor_command_shell
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0001 was leased for 4.3s
2014:11:05 11:44:11 radical.pilot.MainProcess: [DEBUG ] Copying agent 'file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py' to agent sandbox (sftp://stampede.tacc.utexas.edu/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/).
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] init_instance file://localhost//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool (['sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell', 'fork://localhost/shell_file_adaptor_command_shell', 'file://localhost/shell_file_adaptor_command_shell'])
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for file://localhost/shell_file_adaptor_command_shell (1)
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] [<radical.utils.lease_manager._LeaseObject object at 0x7f6a8db2bdd0>]
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8db2bdd0> use: False
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8db2bdd0> use: ok!
2014:11:05 11:44:11 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0000 was unused for 0.0s
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] run_sync: true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py'
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] write: [ 13] [ 125] (true; test -r '//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py'\n)
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 saga.ShellFile : [DEBUG ] read : [ 13] [ 10] (PROMPT-0->)
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 saga.ShellFile : [INFO ] file initialized (0)()
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool (['sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell', 'fork://localhost/shell_file_adaptor_command_shell', 'file://localhost/shell_file_adaptor_command_shell'])
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for fork://localhost/shell_file_adaptor_command_shell (1)
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] [<radical.utils.lease_manager._LeaseObject object at 0x7f6a8c4595d0>]
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c4595d0> use: False
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c4595d0> use: ok!
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0001 was unused for 0.0s
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm check pool (['sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell', 'fork://localhost/shell_file_adaptor_command_shell', 'file://localhost/shell_file_adaptor_command_shell'])
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell (1)
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] [<radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50>]
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> use: False
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm lease object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> use: ok!
2014:11:05 11:44:12 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0002 was unused for 0.0s
2014:11:05 11:44:12 radical.pilot.MainProcess: [DEBUG ] write: [ 16] [ 214] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py" "/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/radical-pilot-agent.py" \n\n)
2014:11:05 11:44:12 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 218] (mput "//users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py" "/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/radical-pilot-agent.py" \n)
2014:11:05 11:44:12 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 216] (Uploading //users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/agent/radical-pilot-agent-multicore.py to /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084/radical-pilot-agent.py\n)
2014:11:05 11:44:12 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 79] (//users/ardita/ExTASY-tools/lib/python2.7/sit 0% 0 0.0KB/s --:-- ETA)
2014:11:05 11:44:13 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 81] (//users/ardita/ExTASY-tools/lib/python2.7/sit 100% 102KB 101.7KB/s 00:01 \n)
2014:11:05 11:44:13 radical.pilot.MainProcess: [DEBUG ] read : [ 16] [ 14] (sftp> \nsftp> )
2014:11:05 11:44:13 radical.pilot.MainProcess: [DEBUG ] copy done: ['sftp>']
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c459c50> for sftp://stampede.tacc.utexas.edu/shell_file_adaptor_command_shell
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0002 was leased for 1.5s
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8db2bdd0> for file://localhost/shell_file_adaptor_command_shell
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0000 was leased for 1.6s
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lm release object <radical.utils.lease_manager._LeaseObject object at 0x7f6a8c4595d0> for fork://localhost/shell_file_adaptor_command_shell
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 radical.utils : [DEBUG ] lo.0001 was leased for 1.5s
2014:11:05 11:44:13 radical.pilot.MainProcess: [DEBUG ] saga.job.Service ('slurm+ssh://stampede.tacc.utexas.edu/')
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Opening shell of type: ssh://stampede.tacc.utexas.edu
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] PTYShell init <saga.utils.pty_shell.PTYShell object at 0x7f6a8db2bc90>
2014:11:05 11:44:13 21037 PilotLauncherWorker-1 saga.SLURMJobService : [INFO ] PTY prompt pattern: [\$#%>\]]\s*$
2014:11:05 11:44:13 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8e78b790>
2014:11:05 11:44:13 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/ssh -t -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2014:11:05 11:44:13 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:14 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 2396] (Last login: Wed Nov 5 05:44:0 ... __________________________\n\n)
2014:11:05 11:44:14 radical.pilot.MainProcess: [DEBUG ] got initial shell prompt (5) (Last login: Wed Nov 5 05:44:08 2014 from tirith.pharm.nottingham.ac.uk
... same Stampede banner as above ...
)
2014:11:05 11:44:14 radical.pilot.MainProcess: [DEBUG ] Got initial shell prompt (5) (same Stampede banner as above)
2014:11:05 11:44:14 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] running command shell: exec /bin/sh -i
2014:11:05 11:44:14 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 47] ( stty -echo ; unset HISTFILE ; exec /bin/sh -i\n)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 244] (----------------------- Project balances for user ardi ------------------------\n| Name Avail SUs Expires | Name Avail SUs Expires |\n| TG-MCB090174 79220 2015-09-30 | TG-TRA140016 -81080 2015-05-06 | \n)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 405] (-------------------------- Disk quotas for user ardi --------------------------\n| Disk Usage (GB) Limit %Used File Usage Limit %Used |\n| /home1 0.4 5.0 8.05 16480 150000 10.99 |\n| /work 0.0 1024.0 0.00 2686 3000000 0.09 |\n-------------------------------------------------------------------------------\n)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 240] (\nTip 61 (See "module help tacc_tips" for features or how to disable)\n\n Have you every tried to find the man-page for a function that has the same name as the command? Try "man 2 <name>" or "man 3 <name>" for example "man 2 time".\n\n)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 21] (login2.stampede(1)$ $)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 100] ( unset PROMPT_COMMAND ; unset HISTFILE ; PS1='PROMPT-$?->'; PS2=''; export PS1 PS2 2>&1 >/dev/null\n)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 1] ($)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 10] (PROMPT-0->)
2014:11:05 11:44:15 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] got new shell prompt
2014:11:05 11:44:15 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Verifying existence of remote SLURM tools.
2014:11:05 11:44:15 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which squeue
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 13] (which squeue\n)
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 27] (/usr/bin/squeue\nPROMPT-0->)
2014:11:05 11:44:15 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which sbatch
2014:11:05 11:44:15 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 13] (which sbatch\n)
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 27] (/usr/bin/sbatch\nPROMPT-0->)
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which scancel
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 14] (which scancel\n)
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 28] (/usr/bin/scancel\nPROMPT-0->)
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: which scontrol
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 15] (which scontrol\n)
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 29] (/usr/bin/scontrol\nPROMPT-0->)
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] got cmd prompt (0)(/usr/bin/scontrol
)
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] No username provided in URL slurm+ssh://stampede.tacc.utexas.edu/, so we are going to find it with whoami
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: whoami
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 7] (whoami\n)
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 16] (ardi\nPROMPT-0->)
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] Username detected as: ardi
2014:11:05 11:44:16 radical.pilot.MainProcess: [INFO ] request cleanup for pilot 545a0d7e54737d522db8e084
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] Bootstrap command line: /bin/bash ['-l', 'default_bootstrapper.sh', "-n radicalpilot -s 545a0d7e54737d522db8e082 -p 545a0d7e54737d522db8e084 -t 10 -d 10 -c 1 -v 0.21 -m ec2-184-72-89-141.compute-1.amazonaws.com:27017 -a : -i /opt/apps/python/epd/7.3.2/bin/python -e 'module purge' -e 'module load TACC' -e 'module load cluster' -e 'module load Linux' -e 'module load mvapich2' -e 'module load python/2.7.3-epd-7.3.2' -e 'module unload xalt' -e 'export TACC_KEEP_FILES=FALSE' -l SLURM -j SSH -k IBRUN -x luve"]
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] Submitting SAGA job with description: {'Queue': 'normal', 'Executable': '/bin/bash', 'WorkingDirectory': '/work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084', 'Project': 'TG-MCB090174', 'WallTimeLimit': 10, 'Arguments': ['-l', 'default_bootstrapper.sh', "-n radicalpilot -s 545a0d7e54737d522db8e082 -p 545a0d7e54737d522db8e084 -t 10 -d 10 -c 1 -v 0.21 -m ec2-184-72-89-141.compute-1.amazonaws.com:27017 -a : -i /opt/apps/python/epd/7.3.2/bin/python -e 'module purge' -e 'module load TACC' -e 'module load cluster' -e 'module load Linux' -e 'module load mvapich2' -e 'module load python/2.7.3-epd-7.3.2' -e 'module unload xalt' -e 'export TACC_KEEP_FILES=FALSE' -l SLURM -j SSH -k IBRUN -x luve"], 'Error': 'AGENT.STDERR', 'Output': 'AGENT.STDOUT', 'TotalCPUCount': 1}
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [WARNING ] number_of_processes not specified in submitted SLURM job description -- defaulting to 1 per total_cpu_count! (1)
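The job description above translates almost field-for-field into the #SBATCH directives of the script that follows, with the warning above filling in the task count from TotalCPUCount. A hypothetical sketch of that mapping:

    def slurm_directives(jd):
        # jd is a dict shaped like the logged SAGA job description.
        return [
            '#SBATCH --ntasks=%d'   % jd.get('TotalCPUCount', 1),
            '#SBATCH -D %s'         % jd['WorkingDirectory'],
            '#SBATCH -o %s'         % jd['Output'],
            '#SBATCH -e %s'         % jd['Error'],
            '#SBATCH -t 00:%02d:00' % jd['WallTimeLimit'],   # minutes
            '#SBATCH -p %s'         % jd['Queue'],
            '#SBATCH -A %s'         % jd['Project'],
        ]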
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [INFO ] Creating working directory /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [DEBUG ] run_sync: mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] write: [ 17] [ 79] (mkdir -p /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084\n)
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] read : [ 17] [ 10] (PROMPT-0->)
2014:11:05 11:44:16 21037 PilotLauncherWorker-1 saga.SLURMJobService : [INFO ] SLURM script generated:
#!/bin/sh
#SBATCH -J "SAGAPythonSLURMJob"
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1
#SBATCH -D /work/02998/ardi/radical.pilot.sandbox/pilot-545a0d7e54737d522db8e084
#SBATCH -o AGENT.STDOUT
#SBATCH -e AGENT.STDERR
#SBATCH -t 00:10:00
#SBATCH -p normal
#SBATCH -A TG-MCB090174
/bin/bash -l default_bootstrapper.sh -n radicalpilot -s 545a0d7e54737d522db8e082 -p 545a0d7e54737d522db8e084 -t 10 -d 10 -c 1 -v 0.21 -m ec2-184-72-89-141.compute-1.amazonaws.com:27017 -a : -i /opt/apps/python/epd/7.3.2/bin/python -e 'module purge' -e 'module load TACC' -e 'module load cluster' -e 'module load Linux' -e 'module load mvapich2' -e 'module load python/2.7.3-epd-7.3.2' -e 'module unload xalt' -e 'export TACC_KEEP_FILES=FALSE' -l SLURM -j SSH -k IBRUN -x luve
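Had staging succeeded, this script would have been written into the sandbox and handed to sbatch, with the job ID parsed from sbatch's stdout. A sketch of that step (never reached in this run; the file name is illustrative):

    import re, subprocess

    out = subprocess.run(['sbatch', 'pilot.slurm'], text=True,
                         capture_output=True, check=True).stdout
    # sbatch prints e.g. 'Submitted batch job 4242217'
    job_id = re.search(r'Submitted batch job (\d+)', out).group(1)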
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] PTYProcess init <saga.utils.pty_process.PTYProcess object at 0x7f6a8c422c90>
2014:11:05 11:44:16 radical.pilot.MainProcess: [INFO ] running: /usr/bin/env TERM=vt100 /usr/bin/sftp -o IdentityFile=/users/ardita/.ssh/id_rsa -o ControlMaster=auto -o ControlPath=/tmp/saga_ssh_ardita_%h_%p.ardi.ctrl -o TCPKeepAlive=no -o ServerAliveInterval=10 -o ServerAliveCountMax=20 ardi@stampede.tacc.utexas.edu
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] write: [ 18] [ 33] ( export PS1='$' ; set prompt='$'\n)
2014:11:05 11:44:16 radical.pilot.MainProcess: [DEBUG ] read : [ 18] [ 43] (Connecting to stampede.tacc.utexas.edu...\n)
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] read : [ 18] [ 48] (Couldn't read packet: Connection reset by peer\n)
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] PTYProcess del <saga.utils.pty_process.PTYProcess object at 0x7f6a8c422c90>
2014:11:05 11:44:17 radical.pilot.MainProcess: [ERROR ] Pilot launching failed: read from process failed '[Errno 5] Input/output error' : (Connecting to stampede.tacc.utexas.edu...
Couldn't read packet: Connection reset by peer
) (/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/utils/pty_process.py +643 (read) : % (e, self.tail)))
Traceback (most recent call last):
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/controller/pilot_launcher_worker.py", line 466, in run
pilotjob.run()
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/job/job.py", line 397, in run
return self._adaptor.run (ttype=ttype)
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/adaptors/cpi/decorators.py", line 51, in wrap_function
return sync_function (self, *args, **kwargs)
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/adaptors/slurm/slurm_job.py", line 1190, in run
self._id = self.js._job_run (self.jd)
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/adaptors/slurm/slurm_job.py", line 580, in _job_run
self.shell.stage_to_remote (src=fname, tgt=tgt)
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/utils/pty_shell.py", line 900, in stage_to_remote
raise ptye.translate_exception (e)
NoSuccess: read from process failed '[Errno 5] Input/output error' : (Connecting to stampede.tacc.utexas.edu...
Couldn't read packet: Connection reset by peer
) (/users/ardita/ExTASY-tools/lib/python2.7/site-packages/saga/utils/pty_process.py +643 (read) : % (e, self.tail)))
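The root failure is transient: the sftp channel opened to stage the generated SLURM script dies with "Connection reset by peer" before the prompt handshake completes (the stray "% (e, self.tail)" in the message is saga's own un-interpolated format string, reproduced verbatim). One common mitigation is to retry the transfer with backoff, sketched here as an assumption rather than anything radical.pilot 0.21 actually does:

    import subprocess, time

    def stage_with_retry(batch, host, attempts=3):
        # Retry the sftp batch transfer a few times before giving up.
        for i in range(attempts):
            try:
                subprocess.run(['sftp', '-b', '-', host], input=batch,
                               text=True, check=True)
                return
            except subprocess.CalledProcessError:
                time.sleep(2 ** i)     # exponential backoff: 1s, 2s, 4s
        raise RuntimeError('staging failed after %d attempts' % attempts)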
2014:11:05 11:44:17 radical.pilot.MainProcess: [INFO ] ComputePilot '545a0d7e54737d522db8e084' state changed from 'Launching' to 'Failed'.
[Callback]: ComputePilot '545a0d7e54737d522db8e084' state: Failed.
2014:11:05 11:44:17 radical.pilot.MainProcess: [ERROR ] pilot manager controller thread caught system exit -- forcing application shutdown
Traceback (most recent call last):
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/controller/pilot_manager_controller.py", line 293, in run
self.call_callbacks(pilot_id, new_state)
File "/users/ardita/ExTASY-tools/lib/python2.7/site-packages/radical/pilot/controller/pilot_manager_controller.py", line 211, in call_callbacks
cb(self._shared_data[pilot_id]['facade_object'](), new_state)
File "simple_bot.py", line 24, in pilot_state_cb
sys.exit (1)
SystemExit: 1
Execution was interrupted
Closing session, exiting now ...
2014:11:05 11:44:17 radical.pilot.MainProcess: [INFO ] Sent 'COMMAND_CANCEL_PILOT' command to all pilots.
2014:11:05 11:44:17 radical.pilot.MainProcess: [INFO ] Sent 'COMMAND_CANCEL_PILOT' command to all pilots.
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] Worker thread (ID: Thread-1[140095524960000]) for PilotManager 545a0d7e54737d522db8e083 stopped.
2014:11:05 11:44:17 radical.pilot.MainProcess: [INFO ] Closed PilotManager 545a0d7e54737d522db8e083.
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] UnitManager.close(): InputFileTransferWorker-1 terminated.
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] UnitManager.close(): InputFileTransferWorker-2 terminated.
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] UnitManager.close(): OutputFileTransferWorker-1 terminated.
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] UnitManager.close(): OutputFileTransferWorker-2 terminated.
2014:11:05 11:44:17 radical.pilot.MainProcess: [DEBUG ] Worker thread (ID: Thread-3[140095483000576]) for UnitManager 545a0d8554737d522db8e085 stopped.
2014:11:05 11:44:17 radical.pilot.MainProcess: [INFO ] Closed UnitManager 545a0d8554737d522db8e085.
2014:11:05 11:44:18 radical.pilot.MainProcess: [INFO ] Deleted session 545a0d7e54737d522db8e082 from database.
2014:11:05 11:44:18 radical.pilot.MainProcess: [INFO ] Closed Session 545a0d7e54737d522db8e082.