Toby Marthews

JASMIN set-up

When I access JASMIN, I usually go via another server (call this ServerA): I log in first to ServerA and then ssh from ServerA to JASMIN.

WARNINGS:
     (1) Search-replace "tmarthews" below with your JASMIN username,
     (2) the extra .bashrc lines on JASMIN are specific to MY setup (e.g. I store my model code in a directory ~/MODELS/ , but you can put yours elsewhere): you will need to modify them appropriately for your setup, and
     (3) Don't forget to make sure your public/private key pair has been set up correctly on ServerA, which involves:
          (i) Saving your private key file on ServerA in directory ~/.ssh/ (see here) and setting it to read-only permissions (e.g. with chmod 400 ~/.ssh/* ; see the sketch just below this list) and
          (ii) Uploading the corresponding PUBLIC key to your JASMIN profile here.
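
For step (3)(i), the set-up on ServerA might look something like the following sketch (an assumption on my part that your private key file is called id_rsa_jasmin, which is the name used in the ssh-add command at the bottom of this page):

mkdir -p ~/.ssh
chmod 700 ~/.ssh
#copy your JASMIN private key to ~/.ssh/id_rsa_jasmin , then make it read-only:
chmod 400 ~/.ssh/id_rsa_jasmin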


I find it useful to add the following lines to the .bashrc on ServerA:
#For JASMIN:
alias jlogin='ssh -A -X tmarthews@login1.jasmin.ac.uk'
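
The same login can also be expressed as an entry in ~/.ssh/config on ServerA rather than as a bash alias. A minimal sketch (assuming OpenSSH; the host name jasmin-login is just an example), after which "ssh jasmin-login" does the same job as jlogin:

#In ~/.ssh/config on ServerA:
Host jasmin-login
    HostName login1.jasmin.ac.uk
    User tmarthews
    IdentityFile ~/.ssh/id_rsa_jasmin
    ForwardAgent yes
    ForwardX11 yes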

I also add the following lines to the .bashrc in my JASMIN home directory (you can edit it from the login server or from the sci machines: the home directory is the same on both):
# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi

alias jcylc='ssh -A -X cylc.jasmin.ac.uk'
alias jsci1='ssh -A -X sci1.jasmin.ac.uk'
alias jsci2='ssh -A -X sci2.jasmin.ac.uk'
alias jsci3='ssh -A -X sci3.jasmin.ac.uk'
alias jsci4='ssh -A -X sci4.jasmin.ac.uk'
alias jsci5='ssh -A -X sci5.jasmin.ac.uk'
alias jsci6='ssh -A -X sci6.jasmin.ac.uk'

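#Nothing below this point is needed on the bare login servers, so stop sourcing .bashrc there: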
[[ $(hostname) = "login1.jasmin.ac.uk" ]] && return
[[ $(hostname) = "login2.jasmin.ac.uk" ]] && return

#Asynchronous notification of background jobs completion
set -o notify

#I've commented these out because the location /apps/contrib/metomi/ no longer exists on JASMIN:
# Provide access to FCM, Rose and Cylc
#export PATH=$PATH:/apps/contrib/metomi/bin
# Enable bash completion for Rose commands
#[[ -f /apps/contrib/metomi/rose/etc/rose-bash-completion ]] && . /apps/contrib/metomi/rose/etc/rose-bash-completion

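#Instead, pick up Cylc, Rose and FCM from copies installed under ~/.local/ :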
export PATH=$HOME/.local/cylc/bin:$PATH
export PATH=$HOME/.local/rose/bin:$PATH
export PATH=$HOME/.local/fcm/bin:$PATH
export PATH=$HOME/.local:$PATH

# Password caching. MetO advice from https://code.metoffice.gov.uk/trac/home/wiki/AuthenticationCaching/GpgAgent
[[ $- != *i* ]] && return # Stop here if not running interactively

[[ $(hostname) = "sci1.jasmin.ac.uk" || $(hostname) = "sci2.jasmin.ac.uk" || $(hostname) = "sci3.jasmin.ac.uk" || $(hostname) = "sci4.jasmin.ac.uk" || $(hostname) = "sci5.jasmin.ac.uk" || $(hostname) = "sci6.jasmin.ac.uk" || $(hostname) = "cylc1.jasmin.ac.uk" ]] && . mosrs-setup-gpg-agent


##If you are asked for your MOSRS password by the following command then
##you have NOT cached your password correctly
#rosie hello
#echo ''
#echo 'Keywords stored on your system for FCM:'
#fcm keyword-print



#See https://help.jasmin.ac.uk/article/176-storage
echo ""
echo "Current disk usage (max: 100 Gb):"
pdu -sh /home/users/tmarthews/
echo "If you see an error about the locking authority file .Xauthority then you have most likely exceeded your home directory disk quota"

#See https://help.jasmin.ac.uk/article/4700-understanding-new-jasmin-storage
echo "Your scratch space is at /work/scratch-pw/tmarthews (now in $SCR ; files older than 28 days are purged weekly)."
export SCR="/work/scratch-pw/tmarthews"
#Don't do this: pdu -sh /work/scratch-pw/tmarthews/
#because it just produces an error "failed to connect to panfs:111 (1): Socket failed with POLLHUP".

echo ""
echo "Here are the groups in which you are included:"
groups

#From Emma R:
#ncmore(){ ncdump $* | more ; }

#xxdiff is not installed and ediff didn't work for me, so if I need a graphical diff tool I set up tkdiff using module load jasmin-sci

#For NCO and GDAL tools you can just load: module load jaspy
#For R: module load jasr




##I usually use only one Rose suite at a time, so it makes sense to define an environment variable $RSUITE pointing to it
export RSUITE=$HOME/roses/u-cr235


#Set $JULES_ROOT to be the JULES source set in $RSUITE (i.e. equal to $JULES_SOURCE as used inside the suite)
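#(the ${tmp1##*=} expansions below keep only the text after the last '=' on the matched line)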
export tmp1=`grep -ir "JULES_SOURCE" $RSUITE/app/fcm_make/rose-app.conf`
export JULES_ROOT=`echo ${tmp1##*=}`
unset tmp1
#Set $OUTPUT_DIR to be the location specified in $RSUITE for output files
export tmp1=`grep -ir "output_dir" $RSUITE/app/jules/rose-app.conf`
export OUTPUT_DIR=`echo ${tmp1##*=} | sed "s/'//g"`
unset tmp1
#Set $CSUITE to be the location used for runtime files during Rose runs of $RSUITE
export CSUITE=$HOME/cylc-run/${RSUITE##*/}
#I also find it useful to have a set location for any namelists extracted from $RSUITE
export NAMELIST=$HOME/roses/nlists_${RSUITE##*/}

echo ''
echo 'Environment variables have now been defined:'
echo '  $RSUITE =' $RSUITE
echo '  $JULES_ROOT =' $JULES_ROOT
echo '  $OUTPUT_DIR =' $OUTPUT_DIR
echo '  $NAMELIST =' $NAMELIST
echo '  $CSUITE =' $CSUITE
echo ''

echo 'The following aliases have also now been set up:'
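#(NN is the symlink that cylc keeps pointing at the most recent submission of each task)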
export LOGSJ=$CSUITE/log/job/1/jules/NN
export LOGSF=$CSUITE/log/job/1/fcm_make/NN
alias nnl="geany $LOGSF/job.out $LOGSF/job.err $LOGSJ/job.out $LOGSJ/job.err &"
alias nnr="geany $RSUITE/suite.rc $RSUITE/app/fcm_make/rose-app.conf $RSUITE/app/jules/rose-app.conf &"
#type nnl
#type nnr

echo ''
echo '  You can open the logs of your most recent JULES run (and its compilation) using nnl'
# (but if you have many runs with different suites, you might want to use Rose Bush)
echo '  Use nnr to edit '${RSUITE##*/}' textfiles directly (the fcm_make and jules parts come up in separate tabs)'

##The following command opens my current Rose suite in Rose Edit.
##This is what I use because I generally only edit one suite at a time. However, if you have several suites and store them all in ~/roses/ then you might want to replace this line with the command "rosie go &" instead (unfortunately, Rosie Go will only look in ~/roses/ for rose suites, not elsewhere).
alias rr="rose edit -C $RSUITE &"
echo '  Open Rose Edit to edit suite' ${RSUITE##*/} 'using rr'

#Rose bush (commented out because not installed on JASMIN)
#alias rb="rose slv --name=${RSUITE##*/} &"
#echo '  Open Rose Bush to look at previous runs of' ${RSUITE##*/} 'using rb'

#Kill command:
alias kk="cylc stop --kill ${RSUITE##*/}"
echo '  Kill any running job of' ${RSUITE##*/} 'using kk'



#I don't use these because I prefer to access jules.exe and create_rose_app using absolute paths:
#export PATH=$JULES_ROOT/build/bin/:$JULES_ROOT/bin/:$PATH




All this means that whenever I want to access JASMIN, (1) I log in to ServerA, (2) I type:

exec ssh-agent $SHELL
ssh-add ~/.ssh/id_rsa_jasmin


which loads my key into the ssh agent, then (3) I type jlogin to get to the JASMIN login server and, finally, (4) I type jcylc to get to the compute machine I use for JULES.
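
As a quick check (my suggestion, not an essential part of the above): once you are on the JASMIN login server, typing

ssh-add -l

should list the key you added on ServerA; if it reports no identities or cannot connect to an authentication agent, then the -A (agent forwarding) part has not worked.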