GridPP approved VOs

Introduction

The GridPP Project Management Board has agreed that up to 10% of GridPP's processing capability should be allocated to non-LHC work. VOs that access the Grid in this way must become Approved VOs; the policies for managing them are described here: Policies_for_GridPP_approved_VOs.

The tables below list the VOs that the GridPP PMB has approved, and the PMB encourages support for these VOs at all of its collaborating sites. Information about all European Grid Initiative (EGI), global and local VOs is given in the Operations Portal, which is the main reference source for VO information (including VO manager, end-points, requirements, etc.).

Yum repository

RPM versions of the VOMS records for Approved VOs are available via the VOMS RPMS Yum Repository. The latest version, which is consistent with the records listed below, is 1.14-1.
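
For example, a site could enable the repository and pull in the records with something like the following (a minimal sketch: the repository URL and package name shown here are illustrative placeholders, not authoritative values):

 # /etc/yum.repos.d/voms-rpms.repo -- illustrative repository definition
 [voms-rpms]
 name=VOMS RPMS
 baseurl=<URL of the VOMS RPMS Yum Repository>
 enabled=1
 gpgcheck=0

 # Install or update the records (illustrative package name):
 yum install voms-vo-records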


NOTA BENE

Some sections in this document are automatically updated from the CIC Portal (approximately once a week). Please do not change the vomsdir/ or vomses/ entries below, or the VO Resource Requirements sections.

Approved EGI VOs

Approved EGI VOs
Name Area Contact


alice LHC experiment at CERN
atlas LHC experiment at CERN
biomed Medical image processing and biomedical data processing
cms LHC experiment at CERN


dteam Default VO for EGI/NGI deployment
esr Earth Science Research covering Solid Earth, Ocean, Atmosphere and their interfaces.
geant4 Geant4 is a Monte Carlo simulation toolkit which simulates the interactions of particles with matter.
lhcb LHC experiment at CERN
magic Gamma ray telescope - Monte Carlo event production
vo.moedal.org The Monopole and Exotics Detector at the LHC experiment at CERN - VO ID card - VOMS server. Jonathan Hays (QMUL)
planck Satellite project for mapping Cosmic Microwave Background
t2k.org Next Generation Long Baseline Neutrino Oscillation Experiment Ben Still (QMUL)


ops The OPS VO is an infrastructure VO that MUST be enabled by all EGI Resource Centres that support the VO concept

Approved Global VOs

Approved Global VOs
Name Area Contact
skatelescope.eu SKA European regional data centre VO ID card - VOMS server. Andrew McNab (Man)
calice CAlorimeter for the LInear Collider Experiment Roman Poeschl


ilc International Linear Collider project (future electron-positron linear collider studies)
icecube Neutrino experiment at the South Pole (Astronomy, Astrophysics and Astro-Particle Physics) Damian Pieloth, Alessandra Forti
microboone Low-energy neutrino experiment (Fermilab)
zeus Particle physics experiment on DESY's electron-proton collider (HERA)
na62.vo.gridpp.ac.uk A CP-violation experiment at CERN Dan Protopopescu (Uni. Glasgow)
ipv6.hepix.org Testing IPv6 readiness of the middleware, applications and tools (HEP, EGI, middleware technology providers and other infrastructures used by WLCG). Chris Walker, Dave Kelsey
lsst LSST UK Large Synoptic Survey Telescope Gabriele Garzoglio, Iain Goodenow



Approved Local VOs

Approved Local VOs
Name Area Contact


solidexperiment.org The SoLid experiment Daniela Bauer
gridpp (join) GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN Jeremy Coles (Cambridge)
pheno A collaboration of UK Particle Physics Phenomenologists who are developing applications for the LHC David Grellscheid (Durham)


mice A neutrino factory experiment Paul Hodgson (Sheffield)


snoplus.snolab.ca A Diverse Instrument for Neutrino Research within the SNOLAB Underground facility Jeanne Wilson (QMU), Matt Mottram (Uni. Sussex)
vo.northgrid.ac.uk (homepage, users) Regional VO to allow access to HEP resources to different local disciplines. Alessandra Forti
vo.scotgrid.ac.uk (homepage) The VO is for academic and other users in the ScotGrid region to test access to EGI and GridPP resources. David Crooks
vo.southgrid.ac.uk (homepage) The VO is for academic and other users in the SouthGrid region to test access to EGI resources. Peter Gronbech
epic.vo.gridpp.ac.uk (join) Veterinary epidemiology in Scotland Thomas Doherty
hyperk.org (join) The Hyper-Kamiokande experiment Christopher Walker, Francesca Di Lodovico


cernatschool.org (join) The CERN@school project. Steve Lloyd (QML), Tom Whyntie (QML, Langton Star Centre)


The VOs below are not in the CIC Portal data
earthsci.vo.gridpp.ac.uk TBD TBD

Other VOs

This section records information about VOs that are site-specific or localised in a region; it can also be used to advertise a local VO that you would like supported elsewhere.

Other VOs
Name Area Contact


vo.landslides.mossaic.org The landslides VO belongs to the Mossaic project (http://mossaic.org/). Luke Kreczko (L.Kreczko@bristol.ac.uk)
enmr.eu unk unk

Approved VOs being established into GridPP infrastructure

As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs (your site cannot yet support these, but we will let you know when each VO is set up and functioning).

VOs being established
Name Area Contact
supernemo.org Searching for Neutrinoless Double Beta Decay Ben Morgan, Jens Jensen, Paolo Franchini
LZ LZ Dark Matter Experiment Daniela Bauer, Elena Korolkova, Dan Bradley
fermilab Umbrella VO for Fermilab Gabriele Garzoglio, Alessandra Forti
dune Deep Underground Neutrino Experiment Elena Korolkova
The VOs below are not yet fully synced from the CIC Portal data:
lz One VOMS server, at Imperial, is added by hand.
dune All VOMS servers added by hand.



VOs that have been removed from approved list

The table below records the history of VOs that have been removed from the approved list for various reasons.

VOs that have been removed
Name Date of removal Notes
vo.londongrid.ac.uk in progress [GGUS] VO no longer used
fusion 30 Jan 2017 Discussion with Rubén Vallés Pérez. VO appears defunct.
superbvo.org 19 Jan 2016 Discussed at Ops Meeting. Defunct.
hone 24 Nov 2015 Discussed at Ops Meeting. Defunct.
vo.sixt.cern.ch 11 Nov 2015 No members, no VOMS servers; defunct.
babar 9 Oct 2013 none
camont.gridpp.ac.uk 9 Oct 2013 none
camont 7 Jun 2017 none
cdf 7 Jun 2017 none
cedar 9 Oct 2013 none
dzero 7 Jun 2017 none
ltwo 9 Oct 2013 none
minos.vo.gridpp.ac.uk 9 Oct 2013 none
na48 9 Oct 2013 none
neiss 7 Jun 2017 none
ngs.ac.uk 9 Oct 2013 none
totalep 9 Oct 2013 none
supernemo.vo.eu-egee.org 24 Feb 2020 Now called supernemo.org.

Example site-info.def entries

The example site-info.def entries for YAIM have been moved to: Example site-info.def entries
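
For reference, a single VO's stanza typically combines the VOMS records listed later on this page. A minimal sketch for the gridpp VO using standard YAIM variables (only one of the three gridpp VOMS servers is shown; see the page linked above for the authoritative entries):

 VO_GRIDPP_SW_DIR=$VO_SW_DIR/gridpp
 VO_GRIDPP_DEFAULT_SE=$SE_HOST
 VO_GRIDPP_VOMS_SERVERS="'vomss://voms.gridpp.ac.uk:8443/voms/gridpp?/gridpp'"
 VO_GRIDPP_VOMSES="'gridpp voms.gridpp.ac.uk 15000 /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk gridpp'"
 VO_GRIDPP_VOMS_CA_DN="'/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B'"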


NOTA BENE

Please do not change the vomsdir/ or vomses/ entries below, as they are automatically updated from the CIC Portal.
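
The records below come in two kinds of file. A vomsdir/ .lsc file carries two lines: the DN of the VOMS server's host certificate, followed by the DN of its issuing CA. A vomses/ file carries a single line of five quoted fields: the VO alias, the VOMS server host, the port, the server certificate DN, and the VO name. Once the files for a VO are installed, the configuration can be exercised by a member of that VO with the standard client, e.g. (assuming a valid user certificate and membership of the VO):

 voms-proxy-init --voms alice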



Virtual Organisation: ALICE

Filename: /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/alice-lcg-voms2.cern.ch

"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms2.cern.ch

"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"


Notes: n/a


Virtual Organisation: ATLAS

Filename: /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/atlas-lcg-voms2.cern.ch

"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms2.cern.ch

"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"


Notes: n/a


Virtual Organisation: BIOMED

Filename: /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc

/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr
/C=FR/O=MENESR/OU=GRID-FR/CN=AC GRID-FR Services

Filename: /etc/vomses/biomed-cclcgvomsli01.in2p3.fr

"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr" "biomed"


Notes: n/a


Virtual Organisation: CMS

Filename: /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/cms-lcg-voms2.cern.ch

"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms2.cern.ch

"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"


Notes: n/a


Virtual Organisation: DTEAM

Filename: /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/dteam-voms2.hellasgrid.gr

"dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"


Notes: n/a


Virtual Organisation: DUNE

Filename: /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/dune-voms1.fnal.gov

"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "dune"

Filename: /etc/vomses/dune-voms2.fnal.gov

"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "dune"


Notes: n/a


Virtual Organisation: ESR

Filename: /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc

/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl
/C=NL/O=NIKHEF/CN=NIKHEF medium-security certification auth

Filename: /etc/vomses/esr-voms.grid.sara.nl

"esr" "voms.grid.sara.nl" "30001" "/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl" "esr"


Notes: n/a


Virtual Organisation: EPIC.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"


Notes: n/a


Virtual Organisation: FERMILAB

Filename: /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/fermilab-voms1.fnal.gov

"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "fermilab"

Filename: /etc/vomses/fermilab-voms2.fnal.gov

"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=IL/L=Batavia/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "fermilab"


Notes: n/a


Virtual Organisation: GEANT4

Filename: /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/geant4-lcg-voms2.cern.ch

"geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"

Filename: /etc/vomses/geant4-voms2.cern.ch

"geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"


Notes: n/a


Virtual Organisation: GRIDPP

Filename: /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/gridpp-voms.gridpp.ac.uk

"gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms02.gridpp.ac.uk

"gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms03.gridpp.ac.uk

"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"


Notes: n/a


Virtual Organisation: HYPERK.ORG

Filename: /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/hyperk.org-voms.gridpp.ac.uk

"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms02.gridpp.ac.uk

"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms03.gridpp.ac.uk

"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"


Notes: n/a


Virtual Organisation: ICECUBE

Filename: /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/icecube-grid-voms.desy.de

"icecube" "grid-voms.desy.de" "15106" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "icecube"


Notes: n/a


Virtual Organisation: ILC

Filename: /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/ilc-grid-voms.desy.de

"ilc" "grid-voms.desy.de" "15110" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "ilc"


Notes: n/a


Virtual Organisation: IPV6.HEPIX.ORG

Filename: /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc

/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms2.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority

Filename: /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it

"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"


Notes: n/a


Virtual Organisation: LHCB

Filename: /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/lhcb-lcg-voms2.cern.ch

"lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"

Filename: /etc/vomses/lhcb-voms2.cern.ch

"lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"


Notes: n/a


Virtual Organisation: LSST

Filename: /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc

/DC=org/DC=incommon/C=US/ST=CA/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/lsst-voms.slac.stanford.edu

"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=CA/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu" "lsst"


Notes:

voms.fnal.gov is only an admin interface. It should not be configured on the machines because it cannot give proxies.

Sites supporting lsst are advised to read GGUS 117587.

(former advice was "It would not do any harm to have it on service nodes but should not be installed on any UI.")



Virtual Organisation: LZ

Filename: /etc/grid-security/vomsdir/lz/lzvoms.grid.hep.ph.ic.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=lzvoms.grid.hep.ph.ic.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/lz/voms.hep.wisc.edu.lsc

/DC=org/DC=incommon/C=US/ST=WI/L=Madison/O=University of Wisconsin-Madison/OU=OCIS/CN=voms.hep.wisc.edu
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/lz-lzvoms.grid.hep.ph.ic.ac.uk

"lz" "lzvoms.grid.hep.ph.ic.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=lzvoms.grid.hep.ph.ic.ac.uk" "lz"

Filename: /etc/vomses/lz-voms.hep.wisc.edu

"lz" "voms.hep.wisc.edu" "15001" "/DC=org/DC=incommon/C=US/ST=WI/L=Madison/O=University of Wisconsin-Madison/OU=OCIS/CN=voms.hep.wisc.edu" "lz"


Notes:

Daniela Bauer and Simon Fayer provide this information for LZ

 Site Specific Settings:
   VO_LZ_SW_DIR=/cvmfs/lz.opensciencegrid.org
   VO_LZ_DEFAULT_SE=gfe02.grid.hep.ph.ic.ac.uk
 UI only:
   WMS_HOSTS="wms01.grid.hep.ph.ic.ac.uk"
 Must be set explicitly for the WMS, true elsewhere:
   MAP_WILDCARDS=yes

Simon points to this regarding CVMFS:

 https://github.com/ATLASConnect/PortableCVMFS/tree/master/conf



Virtual Organisation: MAGIC

Filename: /etc/grid-security/vomsdir/magic/voms01.pic.es.lsc

/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms01.pic.es
/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA eScience SSL CA 3

Filename: /etc/grid-security/vomsdir/magic/voms02.pic.es.lsc

/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms02.pic.es
/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA eScience SSL CA 3

Filename: /etc/vomses/magic-voms01.pic.es

"magic" "voms01.pic.es" "15003" "/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms01.pic.es" "magic"

Filename: /etc/vomses/magic-voms02.pic.es

"magic" "voms02.pic.es" "15003" "/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms02.pic.es" "magic"


Notes: n/a


Virtual Organisation: MICE

Filename: /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mice-voms.gridpp.ac.uk

"mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms02.gridpp.ac.uk

"mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms03.gridpp.ac.uk

"mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"


Notes: n/a



Virtual Organisation: NA62.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"


Notes: n/a


Virtual Organisation: OPS

Filename: /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/ops-lcg-voms2.cern.ch

"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"

Filename: /etc/vomses/ops-voms2.cern.ch

"ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"


Notes: n/a


Virtual Organisation: PHENO

Filename: /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/pheno-voms.gridpp.ac.uk

"pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms02.gridpp.ac.uk

"pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms03.gridpp.ac.uk

"pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"


Notes: n/a


Virtual Organisation: PLANCK

Filename: /etc/grid-security/vomsdir/planck/voms.cnaf.infn.it.lsc

/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority

Filename: /etc/vomses/planck-voms.cnaf.infn.it

"planck" "voms.cnaf.infn.it" "15002" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it" "planck"


Notes: n/a


Virtual Organisation: SNOPLUS.SNOLAB.CA

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk

"snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk

"snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk

"snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"


Notes: n/a


Virtual Organisation: SKATELESCOPE.EU

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk

"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk

"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk

"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"


Notes: n/a


Virtual Organisation: SOLIDEXPERIMENT.ORG

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk

"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk

"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk

"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"


Notes: n/a


Virtual Organisation: SUPERNEMO.ORG

Filename: /etc/grid-security/vomsdir/supernemo.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/supernemo.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/supernemo.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/supernemo.org-voms.gridpp.ac.uk

"supernemo.org" "voms.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "supernemo.org"

Filename: /etc/vomses/supernemo.org-voms02.gridpp.ac.uk

"supernemo.org" "voms02.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "supernemo.org"

Filename: /etc/vomses/supernemo.org-voms03.gridpp.ac.uk

"supernemo.org" "voms03.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "supernemo.org"


Notes: n/a


Virtual Organisation: T2K.ORG

Filename: /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/t2k.org-voms.gridpp.ac.uk

"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms02.gridpp.ac.uk

"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms03.gridpp.ac.uk

"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"


Notes: n/a


Virtual Organisation: ZEUS

Filename: /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/zeus-grid-voms.desy.de

"zeus" "grid-voms.desy.de" "15112" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "zeus"


Notes: n/a


Virtual Organisation: CALICE

Filename: /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/calice-grid-voms.desy.de

"calice" "grid-voms.desy.de" "15102" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "calice"


Notes: n/a


Virtual Organisation: VO.MOEDAL.ORG

Filename: /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch

"vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"

Filename: /etc/vomses/vo.moedal.org-voms2.cern.ch

"vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"


Notes: n/a


Virtual Organisation: VO.NORTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"


Notes: n/a


Virtual Organisation: VO.SOUTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"


Notes: n/a


Virtual Organisation: VO.LANDSLIDES.MOSSAIC.ORG

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"


Notes: n/a


Virtual Organisation: ENMR.EU

Filename: /etc/grid-security/vomsdir/enmr.eu/voms-02.pd.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it
/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA eScience SSL CA 3

Filename: /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc

/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms2.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority

Filename: /etc/vomses/enmr.eu-voms-02.pd.infn.it

"enmr.eu" "voms-02.pd.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it" "enmr.eu"

Filename: /etc/vomses/enmr.eu-voms2.cnaf.infn.it

"enmr.eu" "voms2.cnaf.infn.it" "15014" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms2.cnaf.infn.it" "enmr.eu"


Notes: n/a


Virtual Organisation: CERNATSCHOOL.ORG

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/cernatschool.org-voms.gridpp.ac.uk

"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk

"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk

"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"


Notes: n/a


Virtual Organisation: EARTHSCIENCE (Note that this VO is not in the CIC Portal)
VOMS_SERVERS=TBD
VOMSES=TBD
VOMS_CA_DN=TBD

Notes: n/a



NOTA BENE Please do not change the VO Resource Requirements table below by hand, as it is automatically updated from the CIC Portal.

VO Resource Requirements

VO Resource Requirements
VO RAM/Core (MB) MaxCPU (min) MaxWall (min) Scratch (MB) Other
alice 2000 1320 1500 10000


atlas 2048 3120 5760 20000 Additional runtime requirements:
_ at least 4 GB of virtual memory for each job slot

Software installation common items:

_ the full compiler suite (c/c++ and fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages on SL4-like nodes;
_ the recommended version of the compilers is 3.4.6;
_ the f2c and libgfortran libraries (in both i386 and x86_64 versions, in case of x86_64 systems) are also required to run the software;
_ other libraries required are:
 libpopt.so.0
 libblas.so
_ other applications required are: uuencode, uudecode, bc, curl;
_ high priority in the batch system for the atlassgm user;
_ for nodes running at 64 bits, a copy of Python compiled at 32 bits is also needed to use the 32-bit Python bindings in the middleware. See https://twiki.cern.ch/twiki/bin/view/Atlas/RPMcompatSLC4 for more details;
_ for SL5 nodes please refer to https://twiki.cern.ch/twiki/bin/view/Atlas/RPMCompatSLC5 and https://twiki.cern.ch/twiki/bin/view/Atlas/SL5Migration ;
_ for SL6 nodes please refer to https://twiki.cern.ch/twiki/bin/view/AtlasComputing/RPMCompatSLC6 and https://twiki.cern.ch/twiki/bin/view/LCG/SL6Migration

Software installation setup (cvmfs sites):

_ https://twiki.cern.ch/twiki/bin/view/Atlas/CernVMFS

Software installation requirements (non-cvmfs sites):

_ an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
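
On a worker node, the libraries and applications listed above can be checked with a few shell commands, for example (a sketch based on the names in the list above):

 # Look for the required libraries in the linker cache:
 ldconfig -p | grep -E 'libpopt\.so\.0|libblas\.so'
 # Confirm the required applications are on the PATH:
 which uuencode uudecode bc curl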


biomed 100 1 1 100 For sites providing an SE, the minimum required storage space is 1 TB.


calice 2048 3600 5400 15000 CVMFS is used for the software distribution via:
 /cvmfs/calice.desy.de 

For setup instructions refer to:

  http://grid.desy.de/cvmfs


cernatschool.org 0 0 0 0


cms 2000 2880 4320 20000 Note: the resource figures above are meant per core. CMS usually sends 8-core pilots.

Jobs require an address space larger than the memory size specified above. Sites should allow processes to use at least 6GB of virtual address space more per core than memory to accommodate the large amount of shared libraries used by jobs. (For a typical 8-core pilot that would translate into a VZSIZE limit of at least 64GB.)
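
(Worked through: 8 cores x (2 GB memory per core + 6 GB additional address space per core) = 64 GB for an 8-core pilot, which is where the figure above comes from.)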

Cloud resources should provision 8-core VMs to match standard 8-core pilots.

Input I/O requirement is an average 2.5 MB/s per thread from MSS.

All jobs need to have outbound connectivity.

Sites must not use pool accounts for the FQAN cms:/cms/Role=lcgadmin. For any other CMS job, sites need to use pool accounts so that at any time every grid credential is mapped to an independent local account.


National VOMS groups: CMS uses national VOMS groups, e.g. /cms/becms or /cms/dcms. Proxies with these groups must be "supported" at all sites in the following way:

_ glexec must not fail
_ they should be treated like /cms (the base group) if the site wants no special treatment
_ proxies with such national groups must be able to write to /store/user/temp (the PFN associated with this LFN)


dteam 0 0 0 0


dune 4000 2880 2880 10000


enmr.eu 1000 2880 4320 1000 1) The line:

 "/enmr.eu/*"::::

has to be added to the group.conf file before configuring the grid services via YAIM. On the CREAM-CE this results in the lines:

 "/enmr.eu/*/Role=NULL/Capability=NULL" .enmr
 "/enmr.eu/*" .enmr

in both /etc/grid-security/grid-mapfile and /etc/grid-security/voms-grid-mapfile, and in the lines:

 "/enmr.eu/*/Role=NULL/Capability=NULL" enmr
 "/enmr.eu/*" enmr

in /etc/grid-security/groupmapfile. This is required for every VO group added, in order to implement per-application accounting.

2) Further, multiple queues should ideally be enabled with different Job Wall Clock Time limits:

_ very short: 30 minutes max - for NAGIOS probes, which run with the VO FQAN:

/enmr.eu/ops/Role=NULL/Capability=NULL

_ short : 120 minutes max
_ medium : 12 hours max
_ long : 48 hours

3) A WeNMR-supported application, Gromacs, runs in multithreading mode on multiprocessor boxes (MPI is not needed). Please inform the VO managers if your site does not support this kind of job.

4) WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.


epic.vo.gridpp.ac.uk 0 0 0 0


esr 2048 2100 0 0 Many applications only need part of the following. Java/Perl/Python/C/C++/FORTRAN77,-90,-95; IDL and MATLAB runtime; Scilab or Octave. Needs MPI for some applications.

Some applications require access to job output during execution, some even interaction via X11. 1 GB RAM; some applications need 3 GB RAM. Outbound connectivity from WN to databases. Shared file system needed for MPI applications, with about 10 GB of space. There are applications needing about 1000 simultaneously open files. Depending on application, output file sizes from some MB to 5 GB, for a total of several hundred thousand files. No permanent storage needed but transient and durable. Low-latency scheduling for short jobs needed.


fermilab 0 0 0 0


geant4 1000 650 850 300 Software is distributed via CernVM-FS

(http://cernvm.cern.ch/portal/filesystem); the configuration should include the geant4.cern.ch repository and its dependency areas (sft.cern.ch, grid.cern.ch).

CernVM-FS needs to be accessible on the WNs. The CernVM-FS cache area needed is about 5 GB.


gridpp 1000 1000 0 0


hyperk.org 2000 1440 1440 10000


icecube 4000 2880 2880 40000 CVMFS is used for the software distribution via:

/cvmfs/icecube.opensciencegrid.org


ilc 2048 3600 5400 15000 CVMFS is used for the software distribution via:
 /cvmfs/ilc.desy.de 

For setup instructions refer to:

  http://grid.desy.de/cvmfs


ipv6.hepix.org 0 0 0 0


vo.landslides.mossaic.org 0 0 0 0


lhcb 4000 0 0 20000 Further recommendations from LHCb for sites:

The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of an LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of the memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB, which needs to be added to the field "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.

The amount of space in the field "Max size of scratch space used by jobs" shall be interpreted as 50% each for downloaded input files and produced output files.

Sites that have migrated to CentOS 7 (including "CERN CentOS 7") or a later operating system version are requested to provide support for Singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.
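
For example, a quick check on a worker node (a value of 0 means user namespaces are disabled):

 cat /proc/sys/user/max_user_namespaces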

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points /cvmfs/lhcb.cern.ch, /cvmfs/lhcb-condb.cern.ch, /cvmfs/grid.cern.ch and /cvmfs/cernvm-prod.cern.ch on the worker nodes.
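
Whether these mount points are available on a worker node can be checked with the standard CVMFS client, e.g. (a sketch, assuming the cvmfs_config utility is installed):

 cvmfs_config probe lhcb.cern.ch lhcb-condb.cern.ch grid.cern.ch cernvm-prod.cern.ch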

Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non-T1 sites providing CVMFS, direct HTCondor-CE, ARC, or CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaigns).

Sites not having an SRM installation must provide:

_ disk-only storage
_ a GridFTP endpoint (a single DNS entry)
_ an XROOT endpoint (a single DNS entry)
_ a way to do the accounting (preferably following the WLCG TF standard: https://twiki.cern.ch/twiki/bin/view/LCG/StorageSpaceAccounting)


lsst 0 0 0 0 VO name must be "lsst" as it is an existing VO in OSG!

cf VOMS URL


lz 0 0 0 0


magic 1024 5000 0 0 Fortran77 and other compilers. See details in annex of MoU (documentation section).


mice 0 0 0 0


vo.moedal.org 0 0 0 0


na62.vo.gridpp.ac.uk 2048 500 720 2048 VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch

Access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch is also needed.


vo.northgrid.ac.uk 0 0 0 0


ops 0 0 0 0


pheno 0 0 0 0


planck 0 950 0 0 Need access to job output during execution.

Need R-GMA for monitoring. RAM: 1 GB. Scratch: 200 GB. SE for durable (not permanent) files. Java/Perl/Python/C/C++/Fortran90,-95,-77/Octave; IDL (commercial) where available.


skatelescope.eu 2000 2880 2880 40000


snoplus.snolab.ca 2000 1440 2160 20000 Required packages: g++, gcc, python-devel, uuid-devel, zlib-devel.

SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.


solidexperiment.org 0 0 0 0 Will need to set up CVMFS.


vo.southgrid.ac.uk 0 0 0 0


supernemo.org null null null null


t2k.org 1500 600 600 1000 t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS


zeus 2048 3600 5400 5000 CVMFS is used for the software distribution via:
 /cvmfs/zeus.desy.de 

For setup instructions refer to:

  http://grid.desy.de/cvmfs


Maximum: 4000 5000 5760 40000

VO enablement

The VOs that are enabled at each site are listed in a VO table.

This page is a Key Document, and is the responsibility of Steve Jones. It was last reviewed on 2020-02-24 when it was considered to be 100% complete. It was last judged to be accurate on 2020-02-24.