Difference between revisions of "RALnonLHCCVMFS"

From GridPP Wiki

Revision as of 14:57, 4 November 2014

===CernVM-FS Stratum-0 for non-LHC VOs===

<b>Note:</b> The old page (mainly related to the <code>gridpp.ac.uk</code> CernVM-FS domain) can be found here.


RAL Tier-1 currently hosts CernVM-FS Stratum-0 repositories for 14 non-LHC VOs (<code>auger, biomed, cernatschool.org, glast.org, hone, hyperk.org, km3net.org, mice, na62.vo.gridpp.ac.uk, pheno, phys.vo.ibergrid.eu, snoplus.snolab.ca, t2k.org, enmr.eu</code>).

The Stratum-0 v2.1 repositories are replicated by dedicated EGI CernVM-FS Stratum-1 servers located at RAL, NIKHEF, ASGC and TRIUMF.

At the moment (1 Nov 2014) the repositories hosted at RAL are advertised under two identical CernVM-FS namespaces, i.e. <code>/cvmfs/<repo_name>.gridpp.ac.uk</code> and <code>/cvmfs/<repo_name>.egi.eu</code>. Similarly, the Stratum-1 services (except TRIUMF) replicate both the <code>gridpp.ac.uk</code> and <code>egi.eu</code> domains. The plan is to retire the <code>gridpp.ac.uk</code> domain by very early 2015.

The following instructions refer to configuration under the <code>egi.eu</code> CernVM-FS domain.

==== Setting up a site to access small-VO CernVM-FS repositories located at RAL ====

Assuming that your site already supports CernVM-FS (client v2.1.X) for the LHC VOs, one only needs to update or install the <code>cvmfs-keys</code> package to version 1.5-1 ([https://cvmrepo.web.cern.ch/cvmrepo/yum/cvmfs/EL/5/x86_64/cvmfs-keys-1.5-1.noarch.rpm cvmfs-keys-1.5-1.noarch.rpm]) on the worker nodes. This package adds the public keys and Stratum-1 server addresses for the <code>egi.eu</code> and <code>opensciencegrid.org</code> domains (no changes for the <code>*.cern.ch</code> repositories).
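A quick way to check whether a worker node already has a recent enough <code>cvmfs-keys</code> is to compare versions with <code>sort -V</code>. This is a minimal sketch, not an official tool; the commented <code>rpm -q</code> query assumes an RPM-based node:

```shell
# Check that an installed cvmfs-keys version is at least the required 1.5-1.
# keys_ok VERSION  ->  exit status 0 if VERSION >= 1.5-1
keys_ok() {
    required="1.5-1"
    lowest=$(printf '%s\n%s\n' "$required" "$1" | sort -V | head -n1)
    [ "$lowest" = "$required" ]
}

# On a real worker node one would feed it the installed version, e.g.:
#   keys_ok "$(rpm -q --qf '%{VERSION}-%{RELEASE}' cvmfs-keys)"
keys_ok "1.5-1" && echo "cvmfs-keys is recent enough"
keys_ok "1.4-1" || echo "cvmfs-keys needs updating"
```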

In addition, one must specify the names of the local site squids in <code>/etc/cvmfs/default.local</code>:

<pre>
CVMFS_HTTP_PROXY="http://first.squid.domain:3128;http://second.squid.domain:3128"
</pre>
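For reference, the CernVM-FS client treats the semicolon as a failover separator and the pipe character as a load-balance separator, so the example above tries the second squid only when the first fails. A small sketch of how such a string splits into groups (a hypothetical helper for illustration, not part of CernVM-FS):

```shell
# Split a CVMFS_HTTP_PROXY value into its failover groups:
# ';' separates groups tried in order, '|' separates load-balanced
# proxies inside a single group.
show_proxy_groups() {
    n=1
    echo "$1" | tr ';' '\n' | while read -r group; do
        # print the group's proxies space-separated
        echo "group $n (load-balanced): $group" | tr '|' ' '
        n=$((n + 1))
    done
}

show_proxy_groups "http://first.squid.domain:3128;http://second.squid.domain:3128"
```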

The last requirement is that the site-local squid(s) are configured to use the Stratum-1 servers at RAL, NIKHEF, ASGC and TRIUMF (see <code>/etc/cvmfs/domain.d/egi.eu.conf</code> for the server names).
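A squid allow-list for those servers could be sketched as below. The two hostnames shown (<code>cvmfs-egi.gridpp.rl.ac.uk</code>, <code>cvmfs01.nikhef.nl</code>) are only examples taken from earlier GridPP configuration; the authoritative list is whatever <code>/etc/cvmfs/domain.d/egi.eu.conf</code> contains on your worker nodes:

```shell
# Write a sketch squid.conf fragment allowing the EGI Stratum-1 servers.
# The hostnames are examples only; take the real list from
# /etc/cvmfs/domain.d/egi.eu.conf.
cat > /tmp/cvmfs_stratum1.conf <<'EOF'
acl cvmfs_s1 dstdomain cvmfs-egi.gridpp.rl.ac.uk cvmfs01.nikhef.nl
http_access allow cvmfs_s1
EOF
cat /tmp/cvmfs_stratum1.conf
```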


Finally, do not forget to define the following environment variables on the batch farm:

<b>Note:</b> you need to define a variable only if your site supports the corresponding VO.

- for auger users/jobs
<pre>
VO_AUGER_SW_DIR=/cvmfs/auger.egi.eu
</pre>

- for biomed users/jobs
<pre>
VO_BIOMED_SW_DIR=/cvmfs/biomed.egi.eu
</pre>

- for cernatschool.org users/jobs
<pre>
VO_CERNATSCHOOL_ORG_SW_DIR=/cvmfs/cernatschool.egi.eu
</pre>

- for glast.org users/jobs
<pre>
VO_GLAST_ORG_SW_DIR=/cvmfs/glast.egi.eu
</pre>

- for hyperk.org users/jobs
<pre>
VO_HYPERK_ORG_SW_DIR=/cvmfs/hyperk.egi.eu
</pre>

- for km3net.org users/jobs
<pre>
VO_KM3NET_ORG_SW_DIR=/cvmfs/km3net.egi.eu
</pre>

- for mice users/jobs
<pre>
VO_MICE_SW_DIR=/cvmfs/mice.egi.eu
</pre>

- for na62 users/jobs
<pre>
VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.egi.eu
</pre>

- for pheno users/jobs
<pre>
VO_PHENO_SW_DIR=/cvmfs/pheno.egi.eu
</pre>

- for phys.vo.ibergrid.eu users/jobs
<pre>
VO_PHYS_VO_IBERGRID_EU_SW_DIR=/cvmfs/phys-ibergrid.egi.eu
</pre>

- for snoplus.snolab.ca users/jobs
<pre>
VO_SNOPLUS_SNOLAB_CA_SW_DIR=/cvmfs/snoplus.egi.eu
</pre>

- for t2k.org users/jobs
<pre>
VO_T2K_ORG_SW_DIR=/cvmfs/t2k.egi.eu
</pre>

- for enmr.eu users/jobs
<pre>
VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu
</pre>
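The variable names above all follow the same pattern: the VO name upper-cased, with dots replaced by underscores, wrapped in <code>VO_..._SW_DIR</code>. A sketch of that naming rule is below; note that the repository path is not always derivable the same way (e.g. <code>enmr.eu</code> maps to <code>/cvmfs/wenmr.egi.eu</code>), so only the variable name is generated here:

```shell
# Derive the batch-farm variable name for a VO's software area:
# upper-case the VO name and turn dots into underscores.
vo_sw_dir_var() {
    echo "VO_$(echo "$1" | tr 'a-z.' 'A-Z_')_SW_DIR"
}

vo_sw_dir_var t2k.org               # VO_T2K_ORG_SW_DIR
vo_sw_dir_var snoplus.snolab.ca     # VO_SNOPLUS_SNOLAB_CA_SW_DIR
vo_sw_dir_var na62.vo.gridpp.ac.uk  # VO_NA62_VO_GRIDPP_AC_UK_SW_DIR
```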