GridPP approved VOs
Contents
- 1 Introduction
- 2 Yum repository
- 3 Cleanup Campaign
- 4 Approved EGI VOs
- 5 Approved Global VOs
- 6 Approved Local VOs
- 7 Other VOs
- 8 Approved VOs being established into GridPP infrastructure
- 9 VOs that have been removed from approved list
- 10 Example site-info.def entries
- 11 VO Resource Requirements
- 12 VO enablement
Introduction
The GridPP Project Management Board has agreed that up to 10% of GridPP's processing capability should be allocated to non-LHC work. VOs that access the Grid in this way must become Approved VOs; the policies for managing them are described here: Policies_for_GridPP_approved_VOs.
The tables below list the VOs that the GridPP PMB has approved, and the PMB encourages support for these VOs at all of its collaborating sites. Information about all European Grid Initiative (EGI), global and local VOs is given in the Operations Portal, which is the main reference source for VO information (including VO manager, end-points, requirements etc.).
Yum repository
RPM versions of the VOMS records for Approved VOs should be available via the VOMS RPMS Yum Repository (TODO: change RPM download location). The latest version, consistent with the records listed below, is 1.16-1.
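A minimal sketch of how a site might consume this repository follows; the baseurl is a placeholder (the real download location is pending per the TODO above) and the package name voms-gridpp-approved is hypothetical:

    # Sketch only: the baseurl is a placeholder (see the TODO above) and
    # "voms-gridpp-approved" is a hypothetical package name.
    cat > /etc/yum.repos.d/gridpp-voms.repo <<'EOF'
    [gridpp-voms]
    name=VOMS records for GridPP approved VOs
    # placeholder URL until the new RPM download location is published
    baseurl=https://example.gridpp.ac.uk/voms-rpms/
    enabled=1
    gpgcheck=0
    EOF
    yum install voms-gridpp-approved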
Please do not change the vomsdir/ or vomses/ entries or the VO Resource Requirements section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!
Cleanup Campaign
Approved EGI VOs
Name | Area | Contact
---|---|---
alice | LHC experiment at CERN |
atlas | LHC experiment at CERN | James Walder
biomed | Medical image processing and biomedical data processing |
cms | LHC experiment at CERN | Daniela Bauer
dteam | Default VO for EGI/NGI deployment |
esr | Earth Science Research covering solid Earth, ocean, atmosphere and their interfaces |
geant4 | Geant4 is a Monte Carlo simulation toolkit which emulates the interactions of particles |
lhcb | LHC experiment at CERN | Mark William Slater
magic | Gamma-ray telescope - Monte Carlo event production |
vo.moedal.org | The Monopole and Exotics Detector at the LHC, CERN - VO ID card | Tom Whyntie (QMUL)
planck | Satellite project for mapping the Cosmic Microwave Background |
t2k.org | Next-generation long-baseline neutrino oscillation experiment | Ben Still (QMUL)
ops | The OPS VO is an infrastructure VO that MUST be enabled by all EGI Resource Centres that support the VO concept |
Approved Global VOs
Name | Area | Contact
---|---|---
skatelescope.eu | SKA European regional data centre - VO ID card | Andrew McNab (Manchester)
calice | CAlorimeter for the LInear Collider Experiment | Roman Poeschl
ilc | International Linear Collider project (future electron-positron linear collider studies) |
icecube | Neutrino experiment at the South Pole (astronomy, astrophysics and astro-particle physics) | Damian Pieloth, Alessandra Forti
microboone | Low-energy neutrino experiment (Fermilab) |
zeus | Particle physics experiment on DESY's electron-proton collider (HERA) |
na62.vo.gridpp.ac.uk | Rare kaon decay (CP violation) experiment at CERN | Dan Protopopescu (Uni. Glasgow)
ipv6.hepix.org | IPv6 testing of middleware, applications and tools (HEP, EGI, middleware technology providers and other infrastructures used by WLCG) | Chris Walker, Dave Kelsey
lsst (LSST UK) | Large Synoptic Survey Telescope | Gabriele Garzoglio, Iain Goodenow
Approved Local VOs
Name | Area | Contact
---|---|---
solidexperiment.org | The SoLid experiment | Daniela Bauer
gridpp (join) | GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN | Jeremy Coles (Cambridge)
pheno | A collaboration of UK particle physics phenomenologists who are developing applications for the LHC | Jeppe Andersen / Adam Boutcher / Paul Clark (Durham)
mice | A neutrino factory experiment | Paul Hodgson (Sheffield)
snoplus.snolab.ca | A diverse instrument for neutrino research within the SNOLAB underground facility | Jeanne Wilson (QMUL)
vo.northgrid.ac.uk | Regional VO to allow access to HEP resources for different local disciplines | Alessandra Forti
vo.scotgrid.ac.uk | For academic and other users in the ScotGrid region to test access to EGI and GridPP resources | David Crooks
vo.southgrid.ac.uk | For academic and other users in the SouthGrid region to test access to EGI resources | Peter Gronbech
epic.vo.gridpp.ac.uk | Veterinary epidemiology in Scotland | Thomas Doherty
hyperk.org (join) | The Hyper-Kamiokande experiment | Christopher Walker, Francesca Di Lodovico
cernatschool.org (join) | The CERN@school project | Steve Lloyd (QMUL), Tom Whyntie (QMUL, Langton Star Centre)
The VOs below are not in the CIC Portal data | |
earthsci.vo.gridpp.ac.uk | TBD | TBD
Other VOs
This area records information about VOs that are site-specific or localised in a region, and can be used to advertise a local VO that you would like supported elsewhere.
Name | Area | Contact
---|---|---
vo.landslides.mossaic.org | The landslides VO belongs to the Mossaic project (http://www.bristol.ac.uk/geography/research/hydrology/research/slope/mossiac//). | Luke Kreczko (L.Kreczko@bristol.ac.uk)
enmr.eu | unk | unk
Approved VOs being established into GridPP infrastructure
As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs. Your site cannot support these yet; when each VO is set up and functioning we will let you know.
Name | Area | Contact
---|---|---
LZ | LZ Dark Matter Experiment | Daniela Bauer, Elena Korolkova, Dan Bradley
supernemo.org | Searching for neutrinoless double beta decay | Ben Morgan, Jens Jensen, Paolo Franchini
fermilab | Umbrella VO for Fermilab | Gabriele Garzoglio, Alessandra Forti
dune | Deep Underground Neutrino Experiment | Elena Korolkova
The VOs below are not yet fully synced from the CIC Portal data | |
dune | All VOMS servers added by hand |
VOs that have been removed from approved list
The table below records VOs that have been removed from the approved list, for various reasons.
Name | Date of removal | Notes
---|---|---
vo.londongrid.ac.uk | in progress [GGUS] | VO not used any more
fusion | 30 Jan 2017 | Discussion with Rubén Vallés Pérez. VO appears defunct.
superbvo.org | 19 Jan 2016 | Discussed at Ops Meeting. Defunct.
hone | 24 Nov 2015 | Discussed at Ops Meeting. Defunct.
vo.sixt.cern.ch | 11 Nov 2015 | No members, no VOMS servers, defunct
babar | 9 Oct 2013 | none
camont.gridpp.ac.uk | 9 Oct 2013 | none
camont | 7 Jun 2017 | none
cdf | 7 Jun 2017 | none
cedar | 9 Oct 2013 | none
dzero | 7 Jun 2017 | none
ltwo | 9 Oct 2013 | none
minos.vo.gridpp.ac.uk | 9 Oct 2013 | none
na48 | 9 Oct 2013 | none
neiss | 7 Jun 2017 | none
ngs.ac.uk | 9 Oct 2013 | none
totalep | 9 Oct 2013 | none
supernemo.vo.eu-egee.org | 24 Feb 2020 | Now called supernemo.org
Example site-info.def entries
The example site-info.def entries for YAIM have been moved: Example site-info.def entries
Please do not change the vomsdir/ or vomses/ entries below, as they are automatically updated from the EGI Operations Portal.
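To make the listings concrete: each .lsc file contains two lines, the VOMS server's certificate DN followed by its CA's DN, and each vomses file is the single quoted line shown. As a sketch, deploying the ALICE records from the listing below by hand would look like this:

    # Sketch: write the ALICE VOMS records (data taken verbatim from the
    # listing below) to the standard locations.
    mkdir -p /etc/grid-security/vomsdir/alice /etc/vomses
    cat > /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc <<'EOF'
    /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
    /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    EOF
    # vomses format: "alias" "host" "port" "server DN" "VO name"
    cat > /etc/vomses/alice-lcg-voms2.cern.ch <<'EOF'
    "alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"
    EOF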
Virtual Organisation: ALICE

    Filename: /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/alice-lcg-voms2.cern.ch
        "alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"
    Filename: /etc/vomses/alice-voms2.cern.ch
        "alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"

Virtual Organisation: ATLAS

    Filename: /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/atlas-lcg-voms2.cern.ch
        "atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"
    Filename: /etc/vomses/atlas-voms2.cern.ch
        "atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"
    Filename: /etc/vomses/atlas-voms-atlas-auth.app.cern.ch
        "atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"

Virtual Organisation: BIOMED

    Filename: /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc
        /O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr
        /C=FR/O=MENESR/OU=GRID-FR/CN=AC GRID-FR Services
    Filename: /etc/vomses/biomed-cclcgvomsli01.in2p3.fr
        "biomed" "cclcgvomsli01.in2p3.fr" "15000" "/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr" "biomed"

Virtual Organisation: CALICE

    Filename: /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc
        /C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
        /C=DE/O=GermanGrid/CN=GridKa-CA
    Filename: /etc/vomses/calice-grid-voms.desy.de
        "calice" "grid-voms.desy.de" "15102" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "calice"

Virtual Organisation: CMS

    Filename: /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/cms-lcg-voms2.cern.ch
        "cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"
    Filename: /etc/vomses/cms-voms2.cern.ch
        "cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"
    Filename: /etc/vomses/cms-voms-cms-auth.app.cern.ch
        "cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"

Virtual Organisation: DTEAM

    Filename: /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc
        /C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
        /C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016
    Filename: /etc/vomses/dteam-voms2.hellasgrid.gr
        "dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"

Virtual Organisation: ENMR.EU

    Filename: /etc/grid-security/vomsdir/enmr.eu/voms-02.pd.infn.it.lsc
        /DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it
        /C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
    Filename: /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc
        /DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it
        /C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
    Filename: /etc/vomses/enmr.eu-voms-02.pd.infn.it
        "enmr.eu" "voms-02.pd.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it" "enmr.eu"
    Filename: /etc/vomses/enmr.eu-voms2.cnaf.infn.it
        "enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it" "enmr.eu"

Virtual Organisation: ESR

    Filename: /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc
        /DC=org/DC=terena/DC=tcs/C=NL/L=Utrecht/O=SURF B.V./CN=voms1.grid.surfsara.nl
        /C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
    Filename: /etc/vomses/esr-voms.grid.sara.nl
        "esr" "voms.grid.sara.nl" "30001" "/DC=org/DC=terena/DC=tcs/C=NL/L=Utrecht/O=SURF B.V./CN=voms1.grid.surfsara.nl" "esr"

Virtual Organisation: GEANT4

    Filename: /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/geant4-lcg-voms2.cern.ch
        "geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"
    Filename: /etc/vomses/geant4-voms2.cern.ch
        "geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"

Virtual Organisation: GRIDPP

    Filename: /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/gridpp-voms.gridpp.ac.uk
        "gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"
    Filename: /etc/vomses/gridpp-voms02.gridpp.ac.uk
        "gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"
    Filename: /etc/vomses/gridpp-voms03.gridpp.ac.uk
        "gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"

Virtual Organisation: ILC

    Filename: /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc
        /C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
        /C=DE/O=GermanGrid/CN=GridKa-CA
    Filename: /etc/vomses/ilc-grid-voms.desy.de
        "ilc" "grid-voms.desy.de" "15110" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "ilc"

Virtual Organisation: LHCB

    Filename: /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/lhcb-lcg-voms2.cern.ch
        "lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"
    Filename: /etc/vomses/lhcb-voms2.cern.ch
        "lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"

Virtual Organisation: MAGIC

    Notes: n/a

Virtual Organisation: OPS

    Filename: /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/ops-lcg-voms2.cern.ch
        "ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"
    Filename: /etc/vomses/ops-voms2.cern.ch
        "ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"

Virtual Organisation: PHENO

    Filename: /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/pheno-voms.gridpp.ac.uk
        "pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"
    Filename: /etc/vomses/pheno-voms02.gridpp.ac.uk
        "pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"
    Filename: /etc/vomses/pheno-voms03.gridpp.ac.uk
        "pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"

Virtual Organisation: SNOPLUS.SNOLAB.CA

    Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk
        "snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"
    Filename: /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk
        "snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"
    Filename: /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk
        "snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"

Virtual Organisation: T2K.ORG

    Filename: /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/t2k.org-voms.gridpp.ac.uk
        "t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"
    Filename: /etc/vomses/t2k.org-voms02.gridpp.ac.uk
        "t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"
    Filename: /etc/vomses/t2k.org-voms03.gridpp.ac.uk
        "t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"

Virtual Organisation: VO.NORTHGRID.AC.UK

    Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk
        "vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"
    Filename: /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk
        "vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"
    Filename: /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk
        "vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"

Virtual Organisation: VO.SCOTGRID.AC.UK

    Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk
        "vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"
    Filename: /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk
        "vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"
    Filename: /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk
        "vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Virtual Organisation: ZEUS

    Filename: /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc
        /C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
        /C=DE/O=GermanGrid/CN=GridKa-CA
    Filename: /etc/vomses/zeus-grid-voms.desy.de
        "zeus" "grid-voms.desy.de" "15112" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "zeus"

Virtual Organisation: MICE

    Filename: /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/mice-voms.gridpp.ac.uk
        "mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"
    Filename: /etc/vomses/mice-voms02.gridpp.ac.uk
        "mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"
    Filename: /etc/vomses/mice-voms03.gridpp.ac.uk
        "mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"

Virtual Organisation: VO.LANDSLIDES.MOSSAIC.ORG

    Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk
        "vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"
    Filename: /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk
        "vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"
    Filename: /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk
        "vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"

Virtual Organisation: IPV6.HEPIX.ORG

    Filename: /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc
        /DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it
        /C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
    Filename: /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it
        "ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"

Virtual Organisation: NA62.VO.GRIDPP.AC.UK

    Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk
        "na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
    Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk
        "na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
    Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk
        "na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Virtual Organisation: EPIC.VO.GRIDPP.AC.UK

    Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk
        "epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"
    Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk
        "epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"
    Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk
        "epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Virtual Organisation: LSST

    Filename: /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc
        /DC=org/DC=incommon/C=US/ST=California/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu
        /C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA
    Filename: /etc/vomses/lsst-voms.slac.stanford.edu
        "lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu" "lsst"

Virtual Organisation: HYPERK.ORG

    Filename: /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/hyperk.org-voms.gridpp.ac.uk
        "hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"
    Filename: /etc/vomses/hyperk.org-voms02.gridpp.ac.uk
        "hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"
    Filename: /etc/vomses/hyperk.org-voms03.gridpp.ac.uk
        "hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"

Virtual Organisation: FERMILAB

    Filename: /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc
        /DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
        /C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA
    Filename: /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc
        /DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
        /C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA
    Filename: /etc/vomses/fermilab-voms1.fnal.gov
        "fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "fermilab"
    Filename: /etc/vomses/fermilab-voms2.fnal.gov
        "fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "fermilab"

Virtual Organisation: VO.MOEDAL.ORG

    Filename: /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc
        /DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
        /DC=ch/DC=cern/CN=CERN Grid Certification Authority
    Filename: /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch
        "vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"
    Filename: /etc/vomses/vo.moedal.org-voms2.cern.ch
        "vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"

Virtual Organisation: SKATELESCOPE.EU

    Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk
        "skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"
    Filename: /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk
        "skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"
    Filename: /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk
        "skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"

Virtual Organisation: DUNE

    Filename: /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc
        /DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
        /C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA
    Filename: /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc
        /DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
        /C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA
    Filename: /etc/vomses/dune-voms1.fnal.gov
        "dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "dune"
    Filename: /etc/vomses/dune-voms2.fnal.gov
        "dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "dune"

Virtual Organisation: SUPERNEMO.ORG

    Filename: /etc/grid-security/vomsdir/supernemo.org/voms.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/supernemo.org/voms02.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/grid-security/vomsdir/supernemo.org/voms03.gridpp.ac.uk.lsc
        /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
        /C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
    Filename: /etc/vomses/supernemo.org-voms.gridpp.ac.uk
        "supernemo.org" "voms.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "supernemo.org"
    Filename: /etc/vomses/supernemo.org-voms02.gridpp.ac.uk
        "supernemo.org" "voms02.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "supernemo.org"
    Filename: /etc/vomses/supernemo.org-voms03.gridpp.ac.uk
        "supernemo.org" "voms03.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "supernemo.org"
VO Resource Requirements
Please do not change the table below as it is automatically updated from the EGI Operations Portal. Any changes you make will be lost.
VO | RAM/core (MB) | MaxCPU (min) | MaxWall (min) | Scratch (MB) | Other
---|---|---|---|---|---
alice | 2000 | 1320 | 1500 | 10000 |
atlas | 2048 | 5760 | 5760 | 20000 | Additional runtime requirements: at least 4 GB of VM for each job slot. Software installation, common items: the full compiler suite (C/C++ and Fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages on SL4-like nodes; the recommended compiler version is 3.4.6; the f2c and libgfortran libraries (in both i386 and x86_64 versions on x86_64 systems) are also required to run the software; other required libraries: libpopt.so.0, libblas.so; other required applications: uuencode, uudecode, bc, curl; high priority in the batch system for the atlassgm user; on 64-bit nodes a copy of python compiled at 32 bits is also needed to use the 32-bit python bindings in the middleware (see https://twiki.cern.ch/twiki/bin/view/Atlas/RPMcompatSLC4 for details); for SL5 nodes see https://twiki.cern.ch/twiki/bin/view/Atlas/RPMCompatSLC5 and https://twiki.cern.ch/twiki/bin/view/Atlas/SL5Migration; for SL6 nodes see https://twiki.cern.ch/twiki/bin/view/AtlasComputing/RPMCompatSLC6 and https://twiki.cern.ch/twiki/bin/view/LCG/SL6Migration. Software installation setup (CVMFS sites): https://twiki.cern.ch/twiki/bin/view/Atlas/CernVMFS. Software installation requirements (non-CVMFS sites): an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
biomed | 100 | 1 | 1 | 100 | For sites providing an SE, the minimal required storage space is 1 TB.
calice | 2048 | 3600 | 5400 | 15000 | CVMFS is used for software distribution via /cvmfs/calice.desy.de; for setup instructions see http://grid.desy.de/cvmfs
cernatschool.org | 0 | 0 | 0 | 0 |
cms | 2000 | 2880 | 4320 | 20000 | Note: the resource figures above are per core; CMS usually sends 8-core pilots. Jobs require an address space larger than the memory size specified above: sites should allow processes at least 6 GB of virtual address space more per core than memory, to accommodate the large number of shared libraries used by jobs (for a typical 8-core pilot this translates into a VSIZE limit of at least 64 GB). Cloud resources should provision 8-core VMs to match the standard 8-core pilots. The input I/O requirement is an average of 2.5 MB/s per thread from MSS. All jobs need outbound connectivity. Sites must not use pool accounts for the FQAN cms:/cms/Role=lcgadmin; for any other CMS job, sites need to use pool accounts so that at any time every grid credential is mapped to an independent local account. glexec must not fail. National groups should be treated like /cms (the base group) if no special treatment is wanted by the site, and proxies with such national groups must be able to write to /store/user/temp (the PFN associated with this LFN).
dteam | 0 | 0 | 0 | 0 |
dune | 4000 | 2880 | 2880 | 10000 |
enmr.eu | 1000 | 2880 | 4320 | 1000 | 1) The line "/enmr.eu/*":::: has to be added to the group.conf file before configuring the grid services via YAIM. On the CREAM-CE this is reflected in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" .enmr and "/enmr.eu/*" .enmr in both /etc/grid-security/grid-mapfile and /etc/grid-security/voms-grid-mapfile, and in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" enmr and "/enmr.eu/*" enmr in /etc/grid-security/groupmapfile. This is required to enable whatever VO group is added, for implementing per-application accounting. 2) Multiple queues should ideally be enabled with different job wall-clock time limits: very short (30 minutes max, for NAGIOS probes, which run with the VO FQAN /enmr.eu/ops/Role=NULL/Capability=NULL); short (120 minutes max); medium (12 hours max); long (48 hours). 3) A WeNMR-supported application, Gromacs, runs in multithreading mode on multiprocessor boxes (MPI is not needed); please inform the VO managers if your site does not support this kind of job. 4) The WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22; do not forget to define the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu on all WNs, as pointed out in those documents.
epic.vo.gridpp.ac.uk | 0 | 0 | 0 | 0 |
esr | 2048 | 2100 | 0 | 0 | Many applications only need part of the following: Java/Perl/Python/C/C++/FORTRAN 77/90/95; IDL and MATLAB runtime; Scilab or Octave; MPI for some applications. Some applications require access to job output during execution, some even interaction via X11. 1 GB RAM; some applications need 3 GB RAM. Outbound connectivity from the WN to databases. A shared file system is needed for MPI applications, with about 10 GB of space. Some applications need about 1000 simultaneously open files. Depending on the application, output file sizes range from a few MB to 5 GB, for a total of several hundred thousand files. No permanent storage needed, but transient and durable storage is. Low-latency scheduling is needed for short jobs.
fermilab | 0 | 0 | 0 | 0 |
geant4 | 1000 | 650 | 850 | 300 | Software is distributed via CernVM-FS (http://cernvm.cern.ch/portal/filesystem); the configuration should include geant4.cern.ch and its dependency areas (sft.cern.ch, grid.cern.ch). CernVM-FS needs to be accessible on the WN; the cache area needed is about 5 GB.
gridpp | 1000 | 1000 | 0 | 0 |
hyperk.org | 2000 | 1440 | 1440 | 10000 |
icecube | 4000 | 2880 | 2880 | 40000 | CVMFS is used for software distribution via /cvmfs/icecube.opensciencegrid.org
ilc | 2048 | 3600 | 5400 | 15000 | CVMFS is used for software distribution via /cvmfs/ilc.desy.de; for setup instructions see http://grid.desy.de/cvmfs
ipv6.hepix.org | 0 | 0 | 0 | 0 |
vo.landslides.mossaic.org | 0 | 0 | 0 | 0 |
lhcb | 4000 | 0 | 0 | 20000 | Further recommendations from LHCb for sites: the amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood as the virtual memory required per single process of an LHCb payload. LHCb payloads usually consist of one "worker process", consuming the majority of the memory, and several wrapper processes; the wrapper processes account for a total of 1 GB of virtual memory, which needs to be added to that field if the virtual memory of the whole process tree is monitored. The space in the field "Max size of scratch space used by jobs" shall be interpreted as 50% each for downloaded input files and produced output files. Sites that have migrated to CentOS 7 (including "CERN CentOS 7") or later are requested to support Singularity containers and user namespaces; the latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number. The underlying OS should provide the libraries, binaries and scripts required by the current HEP_OSlibs RPM meta-package. The shared software area shall be provided via CVMFS; LHCb uses the mount points /cvmfs/lhcb.cern.ch, /cvmfs/lhcb-condb.cern.ch, /cvmfs/grid.cern.ch and /cvmfs/cernvm-prod.cern.ch on the worker nodes. Provision a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site. Non-T1 sites providing CVMFS, direct HTCondorCE, ARC or CREAM submission, and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaigns). Sites without an SRM installation must provide: disk-only storage; a GridFTP endpoint (a single DNS entry); an XRootD endpoint (a single DNS entry); and a way to do the accounting (preferably following the WLCG TF standard: https://twiki.cern.ch/twiki/bin/view/LCG/StorageSpaceAccounting).
lsst | 0 | 0 | 0 | 0 | The VO name must be "lsst" as it is an existing VO in OSG (cf. the VOMS URL).
lz | 0 | 0 | 0 | 0 |
magic | 1024 | 5000 | 0 | 0 | Fortran 77 and other compilers; see details in the annex of the MoU (documentation section).
mice | 0 | 0 | 0 | 0 |
vo.moedal.org | 0 | 0 | 0 | 0 |
na62.vo.gridpp.ac.uk | 2048 | 500 | 720 | 2048 | VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch; also needs access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch
vo.northgrid.ac.uk | 0 | 0 | 0 | 0 |
ops | 0 | 0 | 0 | 0 |
pheno | 0 | 0 | 0 | 0 |
planck | 0 | 950 | 0 | 0 | Needs access to job output during execution; needs R-GMA for monitoring; RAM 1 GB; scratch 200 GB; an SE for durable (not permanent) files; Java/Perl/Python/C/C++/Fortran 77/90/95/Octave; IDL (commercial) where available.
skatelescope.eu | 2000 | 2880 | 2880 | 40000 |
snoplus.snolab.ca | 2000 | 1440 | 2160 | 20000 | g++, gcc, python-devel, uuid-devel, zlib-devel; the SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.
solidexperiment.org | 0 | 0 | 0 | 0 | Will need to set up CVMFS.
vo.southgrid.ac.uk | 0 | 0 | 0 | 0 |
supernemo.org | null | null | null | null |
t2k.org | 1500 | 600 | 600 | 1000 | t2k.org software should be mounted on the WNs via CVMFS as described at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
zeus | 2048 | 3600 | 5400 | 5000 | CVMFS is used for software distribution via /cvmfs/zeus.desy.de; for setup instructions see http://grid.desy.de/cvmfs
Maximum: | 4000 | 5760 | 5760 | 40000 |
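Several rows above require software areas mounted via CVMFS, and the lhcb entry asks sites to verify user namespaces. A quick worker-node spot check, assuming the CVMFS client is installed (a sketch, not an official probe):

    # Probe the CVMFS repositories named in the table above.
    for repo in lhcb.cern.ch na62.cern.ch calice.desy.de ilc.desy.de \
                zeus.desy.de icecube.opensciencegrid.org wenmr.egi.eu; do
        cvmfs_config probe "$repo" || echo "CVMFS repository not available: $repo"
    done

    # Per the lhcb row: user namespaces should be enabled (expect a large number).
    cat /proc/sys/user/max_user_namespaces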
VO enablement
The VOs that are enabled at each site are listed in a VO table.
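Once a site lists a VO as enabled, a member of that VO can confirm that the vomses/vomsdir records work end-to-end from a UI with a valid grid certificate (using gridpp as the example VO here):

    # Request a VOMS proxy for the gridpp VO, then inspect its attributes.
    voms-proxy-init --voms gridpp
    voms-proxy-info --all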
This page is a Key Document, and is the responsibility of Gerard Hand. It was last reviewed on 2022-03-28 when it was considered to be 0% complete. It was last judged to be accurate on 2020-03-28.