===CernVM-FS Stratum-0 for non-LHC VOs ===
  
<b>Note:</b> The old page (mainly related to the <code>gridpp.ac.uk</code> CernVM-FS domain) can be found here ([https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS_gridpp_ac_uk]).
  
{| border="1"
 
|+ Stratum-0 repos status at RAL
 
! VO !! Stratum-0 v2.0 !! Stratum-0 v2.1
 
|-
 
! mice
 
| yes || yes
 
|-
 
! na62
 
| yes
 
| yes
 
|-
 
! hone
 
| yes
 
| yes
 
|-
 
! phys.vo.ibergrid.eu
 
| yes
 
| yes
 
|-
 
! enmr.eu
 
| yes
 
| yes
 
|-
 
! hyperk.org
 
| no
 
| yes
 
|-
 
! t2k.org
 
| no
 
| yes
 
|-
 
! glast.org
 
| no
 
| yes
 
|-
 
! cernatschool.org
 
| no
 
| yes
 
|-
 
|}
 
  
 +
RAL Tier-1 is currently maintaining two CernVM-FS domains (<code>egi.eu</code> and <code>gridpp.ac.uk</code>) that include 24 CernVM-FS Stratum-0 repositories for non-LHC VOs and projects (<code>auger, biomed, cernatschool.org, chipster.csc.fi, dirac, galdyn, comet.j-parc.jp, glast.org, hyperk.org, km3net.org, ligo, vo.londongrid.ac.uk, lucid, mice, na62.vo.gridpp.ac.uk, vo.northgrid.ac.uk, pheno, phys.vo.ibergrid.eu, vo.scotgrid.ac.uk, snoplus.snolab.ca, vo.southgrid.ac.uk, t2k.org, enmr.eu</code>).
  
The Stratum-0 v2.1 repositories are replicated by dedicated EGI CernVM-FS Stratum-1 servers located at RAL, NIKHEF, ASGC and TRIUMF.

<!-- At the moment (1 August 2015) the repositories hosted at RAL are published under two identical CernVM-FS spaces i.e. <code>/cvmfs/<repo_name>.gridpp.ac.uk</code> and <code>/cvmfs/<repo_name>.egi.eu</code>. Similarly the Stratum-1 services (except TRIUMF) are replicating both <code>gridpp.ac.uk</code> and <code>egi.eu</code> domains. The plan is to retire the <code>gridpp.ac.uk</code> domain by very early 2016. -->

The following instructions refer to the configuration under the <code>egi.eu</code> CernVM-FS domain.
  
 
==== Setting up a site to access small-VO CernVM-FS repositories located at RAL ====
  
Assuming that your site already supports CernVM-FS (client v2.1.X) for the LHC VOs, you only need to update or install the <code>cvmfs-keys</code> package to version 1.5-1 ([https://cvmrepo.web.cern.ch/cvmrepo/yum/cvmfs/EL/5/x86_64/cvmfs-keys-1.5-1.noarch.rpm]) on the worker nodes. This package adds the public keys and Stratum-1 server addresses for the <code>egi.eu</code> and <code>opensciencegrid.org</code> domains (no changes are needed for the <code>*.cern.ch</code> repositories).
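
For example (a sketch only, assuming the standard CernVM-FS yum repository is already configured on the worker node), the package can be brought to the required version in one of the following ways:

<pre>
# Install or update the cvmfs-keys package from the configured yum repository
yum install cvmfs-keys
yum update cvmfs-keys

# ...or install the RPM directly from the URL given above
rpm -Uvh https://cvmrepo.web.cern.ch/cvmrepo/yum/cvmfs/EL/5/x86_64/cvmfs-keys-1.5-1.noarch.rpm
</pre>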
  
In addition, the names of the local site squids must be specified in <code>/etc/cvmfs/default.local</code>:

<pre>
CVMFS_HTTP_PROXY="http://first.squid.domain:3128;http://second.squid.domain:3128"
</pre>
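
On CernVM-FS 2.1 clients the change can then be applied and checked without remounting; the snippet below is a sketch (the repository name passed to <code>showconfig</code> is only an example):

<pre>
# Re-read the client configuration and display the effective settings
# for one repository (any repository name from this page will do)
cvmfs_config reload
cvmfs_config showconfig mice.egi.eu
</pre>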

The last requirement is that the site local squid(s) allow access to the Stratum-1 servers at RAL, NIKHEF, ASGC and TRIUMF (see <code>/etc/cvmfs/domain.d/egi.eu.conf</code> for the server names).
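
For illustration only, the corresponding <code>squid.conf</code> fragment could look like the sketch below; the host names are examples only and the complete, current list should be taken from <code>/etc/cvmfs/domain.d/egi.eu.conf</code>:

<pre>
# Example ACL allowing the worker nodes to reach the EGI Stratum-1 servers
# through the site squid (host names shown are examples only)
acl cvmfs_stratum1 dstdomain cvmfs-egi.gridpp.rl.ac.uk cvmfs01.nikhef.nl
http_access allow cvmfs_stratum1
</pre>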

Also, do not forget to define the following environment variables on the batch farm (note: you only need to define a variable if your site supports the corresponding VO). A quick way to check that the repositories are actually reachable is shown after the list.

- for <code>auger</code> users/jobs
<pre>
VO_AUGER_SW_DIR=/cvmfs/auger.egi.eu
</pre>

- for <code>biomed</code> users/jobs
<pre>
VO_BIOMED_SW_DIR=/cvmfs/biomed.egi.eu
</pre>

- for <code>cernatschool.org</code> users/jobs
<pre>
VO_CERNATSCHOOL_ORG_SW_DIR=/cvmfs/cernatschool.egi.eu
</pre>

- for <code>comet.j-parc.jp</code> users/jobs
<pre>
VO_COMET_J-PARC_JP_SW_DIR=/cvmfs/comet.egi.eu
</pre>

- for <code>glast.org</code> users/jobs
<pre>
VO_GLAST_ORG_SW_DIR=/cvmfs/glast.egi.eu
</pre>
  
- for <code>hyperk.org</code> users/jobs
<pre>
VO_HYPERK_ORG_SW_DIR=/cvmfs/hyperk.egi.eu
</pre>

- for <code>km3net.org</code> users/jobs
<pre>
VO_KM3NET_ORG_SW_DIR=/cvmfs/km3net.egi.eu
</pre>

- for <code>mice</code> users/jobs
<pre>
VO_MICE_SW_DIR=/cvmfs/mice.egi.eu
</pre>

- for <code>na62</code> users/jobs
<pre>
VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.egi.eu
</pre>

- for <code>pheno</code> users/jobs
<pre>
VO_PHENO_SW_DIR=/cvmfs/pheno.egi.eu
</pre>

- for <code>phys.vo.ibergrid.eu</code> users/jobs
<pre>
VO_PHYS_VO_IBERGRID_EU_SW_DIR=/cvmfs/phys-ibergrid.egi.eu
</pre>

- for <code>snoplus.snolab.ca</code> users/jobs
<pre>
VO_SNOPLUS_SNOLAB_CA_SW_DIR=/cvmfs/snoplus.egi.eu
</pre>

- for <code>t2k.org</code> users/jobs
<pre>
VO_T2K_ORG_SW_DIR=/cvmfs/t2k.egi.eu
</pre>

- for <code>enmr.eu</code> users/jobs
<pre>
VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu
</pre>

- for <code>vo.northgrid.ac.uk</code> users/jobs
<pre>
VO_VO_NORTHGRID_AC_UK_SW_DIR=/cvmfs/northgrid.gridpp.ac.uk
</pre>

- for <code>vo.southgrid.ac.uk</code> users/jobs
<pre>
VO_VO_SOUTHGRID_AC_UK_SW_DIR=/cvmfs/southgrid.gridpp.ac.uk
</pre>

- for <code>vo.londongrid.ac.uk</code> users/jobs
<pre>
VO_VO_LONDONGRID_AC_UK_SW_DIR=/cvmfs/londongrid.gridpp.ac.uk
</pre>

- for <code>vo.scotgrid.ac.uk</code> users/jobs
<pre>
VO_VO_SCOTGRID_AC_UK_SW_DIR=/cvmfs/scotgrid.gridpp.ac.uk
</pre>
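
Once the <code>cvmfs-keys</code> package, the squid settings and the environment variables are in place, access can be verified from a worker node; the repository name below is only an example, any repository listed above will do:

<pre>
# Trigger the autofs mount and list the top level of one repository
cvmfs_config probe mice.egi.eu
ls /cvmfs/mice.egi.eu
</pre>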
