GridPP approved VOs new

From GridPP Wiki

Revision as of 16:02, 1 April 2022

[[Category:VOMS]]

{{KeyDocs|responsible=Gerard Hand|reviewdate=2022-04-01|accuratedate=never|percentage=0}}

<div style="position:fixed; right:9px; top:88px; width:68px; font-size:1.3em; font-weight:bold; background-color: #f00; color:#fff;z-index: 200;text-align: center;padding: 2px;border: 2px solid #634545;">DRAFT</div>

Questions

Here are a few starting questions I have about the approved VO pages.

  1. Has the procedure for approving VOs changed?
  2. Is the "Cleanup Campaign" still running?
  3. Do we want only UK contacts in the VO data, or all contacts?
  4. Who are the missing UK VO contacts?
  5. What is the "ops" VO?
  6. Do we want email addresses on the contacts? Some have them. If so, should they be obfuscated to prevent email scraping?
  7. Some local VOs have links to their join URL. Should they all?

Introduction

The GridPP Project Management Board has agreed that up to 10% of GridPP's processing capability should be allocated to non-LHC work. VOs that access the Grid in this way must become Approved VOs; the policies for managing them are described in Policies_for_GridPP_approved_VOs.

The tables below list the VOs that the GridPP PMB has approved; the PMB encourages all of its collaborating sites to support these VOs. Information about all European Grid Initiative (EGI), global and local VOs is given in the Operations Portal, which is the main reference source for VO information (including VO manager, end-points, requirements etc.).

Yum repository

RPM versions of the VOMS records for Approved VOs are available via the VOMS RPMS Yum Repository.

VOMS RPM Repository v1.16-1. (TODO: change the RPM download location.)


Please Note

Please do not change the vomsdir/ or vomses/ entries or the VO Resource Requirements section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!

Cleanup Campaign

Approved EGI VOs

Name Area Contact
alice LHC experiment at CERN
atlas LHC experiment at CERN James Walder
biomed Medical image processing and biomedical data processing
cms LHC experiment at CERN Daniela Bauer
dteam Default VO for EGI/NGI deployment
esr Earth Science Research covering Solid Earth, Ocean, Atmosphere and their interfaces
geant4 Geant4 is a Monte Carlo simulation toolkit which emulates the interactions of particles
lhcb LHC experiment at CERN Mark William Slater
magic Gamma ray telescope - Monte Carlo event production
vo.moedal.org The Monopole and Exotics Detector at LHC experiment at CERN - VO ID card Tom Whyntie (QMUL)
t2k.org Next Generation Long Baseline Neutrino Oscillation Experiment Ben Still (QMUL)
ops The OPS VO is an infrastructure VO that MUST be enabled by all EGI Resource Centres that support the VO concept
The VOs below are not in the EGI Operations Portal data
planck Satellite project for mapping the Cosmic Microwave Background

Approved Global VOs

Name Area Contact
skatelescope.eu SKA European regional data centre VO ID card Andrew McNab (Man)
calice CAlorimeter for the LInear Collider Experiment Roman Poeschl
ilc International Linear Collider project (future electron-positron linear collider studies)
icecube Neutrino experiment at the South Pole (Astronomy, Astrophysics and Astro-Particle Physics) Damian Pieloth, Alessandra Forti
zeus Particle physics experiment on DESY's electron-proton collider (HERA)
na62.vo.gridpp.ac.uk Another CP violation experiment at CERN Dan Protopopescu (Uni. Glasgow)
ipv6.hepix.org Testing of IPv6 of the middleware, applications and tools (HEP, EGI, middleware technology providers and other infrastructures used by WLCG) Chris Walker, Dave Kelsey (STFC)
lsst LSST UK Large Synoptic Survey Telescope Gabriele Garzoglio, Iain Goodenow
The VOs below are not in the EGI Operations Portal data
microboone Low energy neutrino experiment (Fermilab)

Approved Local VOs

Name Area Contact
solidexperiment.org The SoLid experiment Daniela Bauer, Antonin Vacheret
gridpp (join) GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN Jeremy Coles (Cambridge)
pheno A collaboration of UK Particle Physics Phenomenologists who are developing applications for the LHC Jeppe Andersen, Adam Boutcher, Paul Clark (Durham)
mice A neutrino factory experiment Paul Hodgson (Sheffield), David Colling (Imperial), Daniela Bauer (Imperial), Janusz Martyniak
snoplus.snolab.ca A Diverse Instrument for Neutrino Research within the SNOLAB Underground facility Jeanne Wilson (Queen Mary), Christopher Walker (Queen Mary), Matthew Mottram (Queen Mary)
vo.northgrid.ac.uk Regional VO to allow access to HEP resources for different local disciplines Alessandra Forti, Robert Frank (Manchester)
vo.scotgrid.ac.uk The VO is for academic and other users in the ScotGrid region to test access to EGI and GridPP resources David Crooks, Gareth Roy (Glasgow)
vo.southgrid.ac.uk The VO is for academic and other users in the SouthGrid region to test access to EGI resources Peter Gronbech
epic.vo.gridpp.ac.uk Veterinary epidemiology in Scotland Thomas Doherty
hyperk.org (join) The Hyper-Kamiokande experiment Christopher Walker (Queen Mary), Francesca di Lodovico (Queen Mary)
cernatschool.org (join) The CERN@school project Steve Lloyd (QMUL), Tom Whyntie (QMUL, Langton Star Centre)
The VOs below are not in the EGI Operations Portal data
earthsci.vo.gridpp.ac.uk TBD TBD

Other VOs

This area records information about VOs that are site-specific or localised to a region. It can also be used to advertise a local VO that you would like supported elsewhere.

Name Area Contact
vo.landslides.mossaic.org The landslides VO belongs to the Mossaic project (http://www.bristol.ac.uk/geography/research/hydrology/research/slope/mossiac//). Luke Kreczko (L.Kreczko@bristol.ac.uk)
enmr.eu unk unk

Approved VOs being established into GridPP infrastructure

As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs. (Your site cannot support these yet; we will let you know when each VO is set up and functioning.)

Name Area Contact
LZ LZ Dark Matter Experiment Daniela Bauer, Elena Korolkova, Dan Bradley
supernemo.org Searching for Neutrinoless Double Beta Decay Ben Morgan, Jens Jensen, Paolo Franchini
fermilab Umbrella VO for Fermilab Gabriele Garzoglio, Alessandra Forti
dune Deep Underground Neutrino Experiment Elena Korolkova, Wenlong Yuan
The VOs below are not yet fully synced from the EGI Operations Portal data
dune All VOMS servers added by hand Andrew McNab, Steve Timm
virgo

VOs that have been removed from approved list

The table below records the VOs that have been removed from the approved list, with the date and reason where known.

Name Date of removal Notes
babar 9 Oct 2013 none
camont 7 Jun 2017 none
camont.gridpp.ac.uk 9 Oct 2013 none
cdf 7 Jun 2017 none
cedar 9 Oct 2013 none
dzero 7 Jun 2017 none
fusion 30 Jan 2017 Discussion with Rubén Vallés Pérez. VO appears defunct.
hone 24 Nov 2015 Discussed at Ops Meeting. Defunct.
ltwo 9 Oct 2013 none
minos.vo.gridpp.ac.uk 9 Oct 2013 none
na48 9 Oct 2013 none
neiss 7 Jun 2017 none
ngs.ac.uk 9 Oct 2013 none
superbvo.org 19 Jan 2016 Discussed at Ops Meeting. Defunct.
supernemo.vo.eu-egee.org 24 Feb 2020 Now called supernemo.org
totalep 9 Oct 2013 none
vo.londongrid.ac.uk in progress [GGUS] VO no longer used
vo.sixt.cern.ch 11 Nov 2015 No members, no VOMS servers, defunct

Example site-info.def entries

The example site-info.def entries for YAIM have been moved to: Example site-info.def entries

Please Note

Please do not change the vomsdir/ or vomses/ entries below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!
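Because these entries are machine-generated, a quick local consistency check can catch accidental edits or truncated files before they break proxy validation. A minimal sketch in Python, assuming the standard /etc/grid-security/vomsdir layout used throughout this page (the helper name `check_lsc_files` is ours, not part of any VOMS tooling):

```python
import os

def check_lsc_files(vomsdir="/etc/grid-security/vomsdir"):
    """Scan vomsdir for .lsc files. Each should contain exactly two
    non-empty lines, both DNs: the VOMS server's subject DN, then the
    DN of the CA that issued its certificate. Returns suspect paths."""
    bad = []
    for root, _dirs, files in os.walk(vomsdir):
        for name in files:
            if not name.endswith(".lsc"):
                continue
            path = os.path.join(root, name)
            with open(path) as fh:
                dns = [line.strip() for line in fh if line.strip()]
            # A well-formed .lsc has two DN lines, each starting with "/"
            if len(dns) != 2 or not all(dn.startswith("/") for dn in dns):
                bad.append(path)
    return bad
```

Running it against a site's vomsdir and comparing the output to this page (or the EGI Operations Portal) is one way to spot drift.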


Virtual Organisation: ALICE

Filename: /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/alice-lcg-voms2.cern.ch

"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms2.cern.ch

"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"

Notes: n/a
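The ALICE entries above follow the pattern used for every VO on this page: the vomses file is named `<vo>-<host>` and contains a single five-field record, with the VO alias repeated as the last field. A minimal sketch of that naming convention (the helper `vomses_entry` is hypothetical, for illustration only):

```python
def vomses_entry(vo, host, port, dn):
    """Build the conventional /etc/vomses filename and its one-line record:
    "<vo>" "<host>" "<port>" "<server subject DN>" "<vo alias>"."""
    filename = f"/etc/vomses/{vo}-{host}"
    record = f'"{vo}" "{host}" "{port}" "{dn}" "{vo}"'
    return filename, record

# For example, the first ALICE server listed above:
fname, rec = vomses_entry(
    "alice", "lcg-voms2.cern.ch", 15000,
    "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch")
```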


Virtual Organisation: ATLAS

Filename: /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/atlas-lcg-voms2.cern.ch

"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms2.cern.ch

"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms-atlas-auth.app.cern.ch

"atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"

Notes: n/a
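Going the other way, a vomses record can be split back into its fields with shell-style tokenisation; note that the third ATLAS entry above uses port 443 and a subject DN (atlas-auth.web.cern.ch) that differs from the vomses hostname, so the DN field should always be read from the record rather than derived from the hostname. A small parsing sketch (the function and field names are our own):

```python
import shlex

def parse_vomses(line):
    """Split one vomses record into its five quoted fields."""
    vo, host, port, dn, alias = shlex.split(line)
    return {"vo": vo, "host": host, "port": int(port), "dn": dn, "alias": alias}
```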


Virtual Organisation: BIOMED

Filename: /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc

/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr
/C=FR/O=MENESR/OU=GRID-FR/CN=AC GRID-FR Services

Filename: /etc/vomses/biomed-cclcgvomsli01.in2p3.fr

"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr" "biomed"

Notes: n/a


Virtual Organisation: CALICE

Filename: /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/calice-grid-voms.desy.de

"calice" "grid-voms.desy.de" "15102" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "calice"

Notes: n/a


Virtual Organisation: CMS

Filename: /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/cms-lcg-voms2.cern.ch

"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms2.cern.ch

"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms-cms-auth.app.cern.ch

"cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"

Notes: n/a


Virtual Organisation: DTEAM

Filename: /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/dteam-voms2.hellasgrid.gr

"dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"

Notes: n/a


Virtual Organisation: ENMR.EU

Filename: /etc/grid-security/vomsdir/enmr.eu/voms-02.pd.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/enmr.eu-voms-02.pd.infn.it

"enmr.eu" "voms-02.pd.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it" "enmr.eu"

Filename: /etc/vomses/enmr.eu-voms2.cnaf.infn.it

"enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it" "enmr.eu"

Notes: n/a


Virtual Organisation: ESR

Filename: /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc

/DC=org/DC=terena/DC=tcs/C=NL/L=Utrecht/O=SURF B.V./CN=voms1.grid.surfsara.nl
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/esr-voms.grid.sara.nl

"esr" "voms.grid.sara.nl" "30001" "/DC=org/DC=terena/DC=tcs/C=NL/L=Utrecht/O=SURF B.V./CN=voms1.grid.surfsara.nl" "esr"

Notes: n/a


Virtual Organisation: GEANT4

Filename: /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/geant4-lcg-voms2.cern.ch

"geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"

Filename: /etc/vomses/geant4-voms2.cern.ch

"geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"

Notes: n/a


Virtual Organisation: GRIDPP

Filename: /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/gridpp-voms.gridpp.ac.uk

"gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms02.gridpp.ac.uk

"gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms03.gridpp.ac.uk

"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"

Notes: n/a


Virtual Organisation: ILC

Filename: /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/ilc-grid-voms.desy.de

"ilc" "grid-voms.desy.de" "15110" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "ilc"

Notes: n/a


Virtual Organisation: LHCB

Filename: /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/lhcb-lcg-voms2.cern.ch

"lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"

Filename: /etc/vomses/lhcb-voms2.cern.ch

"lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"

Notes: n/a


Virtual Organisation: MAGIC

Notes: n/a


Virtual Organisation: OPS

Filename: /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/ops-lcg-voms2.cern.ch

"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"

Filename: /etc/vomses/ops-voms2.cern.ch

"ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"

Notes: n/a


Virtual Organisation: PHENO

Filename: /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/pheno-voms.gridpp.ac.uk

"pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms02.gridpp.ac.uk

"pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms03.gridpp.ac.uk

"pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"

Notes: n/a


Virtual Organisation: SNOPLUS.SNOLAB.CA

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk

"snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk

"snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk

"snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"

Notes: n/a


Virtual Organisation: T2K.ORG

Filename: /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/t2k.org-voms.gridpp.ac.uk

"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms02.gridpp.ac.uk

"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms03.gridpp.ac.uk

"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"

Notes: n/a


Virtual Organisation: VO.NORTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SCOTGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Notes: n/a


Virtual Organisation: ZEUS

Filename: /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/zeus-grid-voms.desy.de

"zeus" "grid-voms.desy.de" "15112" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "zeus"

Notes: n/a


Virtual Organisation: MICE

Filename: /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mice-voms.gridpp.ac.uk

"mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms02.gridpp.ac.uk

"mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms03.gridpp.ac.uk

"mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"

Notes: n/a


Virtual Organisation: VO.LANDSLIDES.MOSSAIC.ORG

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"

Notes: n/a


Virtual Organisation: IPV6.HEPIX.ORG

Filename: /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it

"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"

Notes: n/a


Virtual Organisation: NA62.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: EPIC.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: LSST

Filename: /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc

/DC=org/DC=incommon/C=US/ST=California/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/lsst-voms.slac.stanford.edu

"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu" "lsst"

Notes: n/a


Virtual Organisation: HYPERK.ORG

Filename: /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/hyperk.org-voms.gridpp.ac.uk

"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms02.gridpp.ac.uk

"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms03.gridpp.ac.uk

"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"

Notes: n/a


Virtual Organisation: FERMILAB

Filename: /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/fermilab-voms1.fnal.gov

"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "fermilab"

Filename: /etc/vomses/fermilab-voms2.fnal.gov

"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "fermilab"

Notes: n/a


Virtual Organisation: VO.MOEDAL.ORG

Filename: /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch

"vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"

Filename: /etc/vomses/vo.moedal.org-voms2.cern.ch

"vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"

Notes: n/a


Virtual Organisation: SKATELESCOPE.EU

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk

"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk

"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk

"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"

Notes: n/a
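Each .lsc file above pairs with a vomses entry for the same host: the first line of the .lsc file is the VOMS server's subject DN, which should equal the DN quoted in the matching vomses entry. A minimal consistency check (an illustrative helper, not part of the VOMS tooling):

```python
def lsc_matches_vomses(lsc_lines: list[str], vomses_dn: str) -> bool:
    """True if the server subject DN (first non-empty .lsc line)
    equals the DN quoted in the corresponding vomses entry."""
    subject = next(line.strip() for line in lsc_lines if line.strip())
    return subject == vomses_dn

# skatelescope.eu example, taken from the listings above
lsc = [
    "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk",
    "/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B",
]
ok = lsc_matches_vomses(lsc, "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk")
```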


Virtual Organisation: DUNE

Filename: /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/dune-voms1.fnal.gov

"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "dune"

Filename: /etc/vomses/dune-voms2.fnal.gov

"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "dune"

Notes: n/a


Virtual Organisation: SUPERNEMO.ORG

Filename: /etc/grid-security/vomsdir/supernemo.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/supernemo.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/supernemo.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/supernemo.org-voms.gridpp.ac.uk

"supernemo.org" "voms.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "supernemo.org"

Filename: /etc/vomses/supernemo.org-voms02.gridpp.ac.uk

"supernemo.org" "voms02.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "supernemo.org"

Filename: /etc/vomses/supernemo.org-voms03.gridpp.ac.uk

"supernemo.org" "voms03.gridpp.ac.uk" "15515" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "supernemo.org"

Notes: n/a
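Each vomses entry above consists of five quoted fields: alias, VOMS host, port, server DN, and VO name. A minimal parser for that format (an illustrative sketch, not part of the grid middleware):

```python
import shlex

def parse_vomses(line: str) -> dict:
    """Split a vomses entry into its five quoted fields.

    Field order, as seen in the entries above:
    alias, VOMS host, port, server DN, VO name.
    """
    fields = shlex.split(line)
    if len(fields) != 5:
        raise ValueError(f"expected 5 quoted fields, got {len(fields)}")
    alias, host, port, dn, vo = fields
    return {"alias": alias, "host": host, "port": int(port), "dn": dn, "vo": vo}

# supernemo.org entry from the listing above
entry = parse_vomses(
    '"supernemo.org" "voms.gridpp.ac.uk" "15515" '
    '"/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "supernemo.org"'
)
```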




VO Resource Requirements

Please Note

Please do not change the table below as it is automatically updated from the EGI Operations Portal. Any changes you make will be lost.


VO RAM/Core (MB) MaxCPU (min) MaxWall (min) Scratch (MB) Other
alice 2000 1320 1500 10000
atlas 2048 5760 5760 20000 Additional runtime requirements:
  • at least 4GB of VM for each job slot

Software installation common items:

  • the full compiler suite (C/C++ and Fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages on SL4-like nodes;
  • the recommended version of the compilers is 3.4.6;
  • the f2c and libgfortran libraries (in both i386 and x86_64 versions on x86_64 systems) are also required to run the software;
  • other libraries required are:
libpopt.so.0
libblas.so

Software installation setup (cvmfs sites):

Software installation requirements (non-cvmfs sites):

  • an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
biomed 100 1 1 100 For sites providing an SE, the minimum required storage space is 1 TB.
calice 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/calice.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
cms 2000 2880 4320 20000 Note: CMS usually sends 8-core pilots, values for 'Multi Core' refer to that. Single-core pilots are discouraged.

Jobs require an address space larger than the memory size specified above. Sites should allow processes at least 6GB of virtual address space per core beyond the memory limit, to accommodate the large number of shared libraries used by jobs. (For a typical 8-core pilot that translates into a VSIZE limit of at least 64GB.)

Cloud resources should provision 8-core VMs to match standard 8-core pilots.

Input I/O requirement is an average 2.5 MB/s per thread from MSS.

All jobs need to have outbound connectivity.

Sites must not use pool accounts for the FQAN cms:/cms/Role=lcgadmin. For any other CMS job, sites need to use pool accounts so that at any time every grid credential is mapped to an independent local account.


National VOMS groups: In CMS, national VOMS groups, e.g. /cms/becms or /cms/dcms, are used. Such proxies must be supported at all sites in the following way:

  • they should be treated like /cms (the base group), unless the site wants special treatment
  • proxies with such national groups must be able to write to /store/user/temp (the PFN associated to this LFN)
dteam None None None None
enmr.eu 8000 2880 4320 1000
  1. For COVID-19 related jobs, slots with 8 GB/Core are required
  2. WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.
  3. The line:

The line "/enmr.eu/*":::: has to be added to the group.conf file before configuring the grid services via YAIM. On a CREAM-CE this results in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" .enmr and "/enmr.eu/*" .enmr in both /etc/grid-security/grid-mapfile and /etc/grid-security/voms-grid-mapfile, and in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" enmr and "/enmr.eu/*" enmr in /etc/grid-security/groupmapfile. Every VO group added in this way must be enabled, in order to implement per-application accounting.
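The group.conf change for enmr.eu expands into pairs of map lines whose only difference is the pool-account dot prefix. A minimal sketch of that expansion (the helper name is illustrative; YAIM itself performs this rendering):

```python
def mapfile_lines(fqan_pattern: str, group: str, pool: bool) -> list[str]:
    """Render the two map lines for a wildcard VOMS FQAN.

    grid-mapfile / voms-grid-mapfile use a '.'-prefixed pool account;
    the groupmapfile uses the bare group name.
    """
    account = ("." if pool else "") + group
    return [
        f'"{fqan_pattern}/Role=NULL/Capability=NULL" {account}',
        f'"{fqan_pattern}" {account}',
    ]

# Lines expected in /etc/grid-security/grid-mapfile and voms-grid-mapfile:
pool_lines = mapfile_lines("/enmr.eu/*", "enmr", pool=True)
# Lines expected in /etc/grid-security/groupmapfile:
group_lines = mapfile_lines("/enmr.eu/*", "enmr", pool=False)
```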

esr 2048 2100 0 0 Many applications only need part of the following. Java/Perl/Python/C/C++/FORTRAN77,-90,-95; IDL and MATLAB runtime; Scilab or Octave. Needs MPI for some applications.

Some applications require access to job output during execution, some even interaction via X11. 1 GB RAM; some applications need 3 GB RAM. Outbound connectivity from WN to databases. A shared file system is needed for MPI applications, with about 10 GB of space. Some applications need about 1000 simultaneously open files. Depending on the application, output file sizes range from a few MB to 5 GB, for a total of several hundred thousand files. No permanent storage is needed, only transient and durable storage. Low-latency scheduling for short jobs is needed.

geant4 1000 650 850 300 Software is distributed via CernVM-FS

(http://cernvm.cern.ch/portal/filesystem); the configuration should include the geant4.cern.ch repository and its dependency areas (sft.cern.ch, grid.cern.ch).

CernVM-FS needs to be accessible on the WNs; the CernVM-FS cache area needed is about 5 GB.

gridpp 1000 1000 0 0
ilc 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/ilc.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
lhcb 0 0 0 20000 Further recommendations from LHCb for sites:

The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of an LHCb payload. Usually LHCb payloads consist of one "worker process", which consumes the majority of the memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB, which needs to be added to the field "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.

The amount of space in field "Max size of scratch space used by jobs", shall be interpreted as 50 % each for downloaded input files and produced output files.

Sites should have CentOS 7 (or "CERN CentOS 7"), or a later version, installed on their worker nodes. Sites are requested to provide support for Singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points /cvmfs/lhcb.cern.ch, /cvmfs/lhcb-condb.cern.ch, /cvmfs/grid.cern.ch and /cvmfs/cernvm-prod.cern.ch on the worker nodes.

Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non-T1 sites providing CVMFS, direct HTCondorCE, ARC, or CREAM submission, and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaigns).

Sites not having an SRM installation must provide:

magic 1024 5000 0 0 Fortran77 and other compilers. See details in annex of MoU (documentation section).
ops 0 0 0 0
pheno 0 0 0 0
snoplus.snolab.ca 2000 1440 2160 20000 g++

gcc python-devel uuid-devel zlib-devel

SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.

t2k.org 1500 600 600 1000 t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
vo.northgrid.ac.uk 0 0 0 0
vo.scotgrid.ac.uk 0 0 0 0
zeus 2048 3600 5400 5000 CVMFS is used for the software distribution via:
/cvmfs/zeus.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
mice 0 0 0 0
vo.landslides.mossaic.org 0 0 0 0
ipv6.hepix.org 0 0 0 0
na62.vo.gridpp.ac.uk 2048 500 720 2048 VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch

Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch

epic.vo.gridpp.ac.uk 0 0 0 0
lsst 0 0 0 0 VO name must be "lsst" as it is an existing VO in OSG!

cf VOMS URL

hyperk.org 0 1440 1440 10000
fermilab 0 0 0 0
vo.moedal.org 0 0 0 0
skatelescope.eu None None None None
dune 0 2880 2880 10000
supernemo.org None None None None
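Assuming the EGI VO ID card conventions for the numeric columns (RAM/Core in MB, MaxCPU and MaxWall in minutes, Scratch in MB — an interpretation, since the table does not state its units), the figures above can be sanity-checked. The helpers below are illustrative only:

```python
# Units assumed from EGI VO ID card conventions (MB and minutes);
# this is an interpretation, not stated explicitly in the table itself.

def vsize_limit_gb(ram_per_core_mb: float, extra_gb_per_core: float, cores: int) -> float:
    """Per-pilot virtual address space, following the CMS note above:
    (RAM per core plus extra address space) times the pilot's core count.
    Uses decimal GB (1 GB = 1000 MB) for simplicity."""
    return cores * (ram_per_core_mb / 1000 + extra_gb_per_core)

def minutes_to_hours(minutes: float) -> float:
    return minutes / 60

# CMS row: 2000 MB/core, plus 6 GB/core of extra address space, 8-core pilot
cms_limit_gb = vsize_limit_gb(2000, 6, 8)   # reproduces the 64 GB in the CMS note
# atlas row: MaxWall 5760 minutes
atlas_wall_hours = minutes_to_hours(5760)   # 96 hours
```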


VO enablement

The VOs that are enabled at each site are listed in a VO table.

This page is a Key Document and is the responsibility of Gerard Hand. It was last reviewed on 2022-04-01, when it was considered to be 0% complete. It has never been judged to be accurate.

DRAFT