GridPP approved VOs
== Introduction ==

The GridPP Project Management Board ('''PMB''') has agreed that up to 10% of GridPP's processing capability should be allocated to non-LHC work. VOs that access the Grid in this way must become [[Policies_for_GridPP_approved_VOs|Approved VOs]]; the policies for managing approved VOs are described in [[Policies_for_GridPP_approved_VOs]].

The tables below list the VOs that the GridPP PMB has approved, and the PMB encourages support for these VOs at all of its collaborating sites. Information about all European Grid Initiative ('''EGI'''), global and local VOs is given in the [http://operations-portal.egi.eu/ EGI Operations Portal], which is the main reference source for VO information (including VO manager, end-points and resource requirements).

RPM versions of the VOMS records for Approved VOs are available via the [http://hep.ph.liv.ac.uk/~sjones/RPMS.voms/ VOMS RPMS Yum repository]; the latest version is consistent with the Yaim records listed below.
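Sites consuming these records via yum can point a repository definition at the URL above. A minimal sketch of such a file (the repository id, filename and gpgcheck setting are illustrative assumptions, not taken from this page):

```
# /etc/yum.repos.d/voms-records.repo  (illustrative filename)
[voms-records]
name=VOMS records for GridPP approved VOs
baseurl=http://hep.ph.liv.ac.uk/~sjones/RPMS.voms/
enabled=1
# Set gpgcheck=1 and import the signing key if the repository is signed.
gpgcheck=0
```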
<!--
== Yum repository ==

RPM versions of the VOMS records for Approved VOs are available via the VOMS RPMS Yum Repository.

<div style="text-align:center">[http://hep.ph.liv.ac.uk/~sjones/RPMS.voms/ VOMS RPM Repository v1.16-1]</div>
-->
  
Some sections in this document are automatically updated from the CIC Portal (approximately once a week). Please do not change the tables in the '''VO Yaim Records''' or the '''VO Resource Requirements''' sections.

<div style="margin:auto; border:2px solid black;background-color:#EEEEEE;width:600px; max-width:97%">
<div style="font-size:1.2em; font-weight:bold; padding-left:4px;background-color:#7C8AAF;color:#fff;">Please Note</div>
<div style="padding:3px 6px">
Please do not change the '''vomsdir/''' or '''vomses/''' entries or the '''VO Resource Requirements''' section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!
</div>
</div>
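For reference, each '''vomses/''' entry distributed by these records is a single line giving the VO alias, the VOMS server host name, the vomses port, the server host certificate DN, and the VO name. An illustrative sketch (the host, port and DN below are made-up placeholders; the real values come from the EGI Operations Portal):

```
"dteam" "voms.example.org" "15004" "/C=UK/O=ExampleCA/CN=voms.example.org" "dteam"
```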
<!--
== Cleanup Campaign ==

* [[VO Cleanup Campaign]]
-->
  
==Approved VOs==

{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+Approved VOs -->
|-style="background:#7C8AAF;color:white"
!Name
!Area
!Contact
<!-- start main approved list -->

|-
|[https://alice-collaboration.web.cern.ch/ alice]
|The ALICE Collaboration operates a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Its aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected.
|{{@|Latchezar.Betev|cern.ch}}<br>{{@|Maarten.Litmaath|cern.ch}}<br>{{@|costin.grigoras|cern.ch}}
  
 
|-
|[http://bes.ihep.ac.cn/ bes]
|The Beijing Spectrometer (BES) is a general-purpose detector located in the interaction region of the BEPC storage ring, where the electron and positron beams collide. The BES Collaboration consists of approximately 200 physicists and engineers from 27 institutions in 4 countries.
|

|-
|[http://lsgc.org/biomed.html biomed]
|This VO covers the areas related to health and life sciences. Currently it is divided into three sectors: medical imaging, bioinformatics and drug discovery. The VO is openly accessible to academics, and to private companies for non-commercial purposes.
|{{@|glatard|creatis.insa-lyon.fr}}<br>{{@|jerome.pansanel|iphc.cnrs.fr}}<br>{{@|sorina.pop|creatis.insa-lyon.fr}}

|-
|[https://twiki.cern.ch/twiki/bin/view/CALICE/ calice]
|CAlorimeter for the LInear Collider Experiment: a high-granularity calorimeter optimised for the Particle Flow measurement of multi-jet final states at the International Linear Collider, running at a centre-of-mass energy between 90 GeV and 1 TeV.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}

|-
|[http://cepc.ihep.ac.cn/ cepc]
|The Circular Electron Positron Collider (CEPC) is a large international scientific facility proposed by the Chinese particle physics community in 2012.
|

|-
|[http://researchinschools.org/CERN/ cernatschool.org]
|The CERN@school VO represents the CERN@school project on the Grid. CERN@school aims to bring CERN technology into the classroom to aid with the teaching of physics and to inspire the next generation of scientists and engineers. The VO allows students and teachers involved with the project to harness GridPP to store and analyse data from the CERN@school detectors, the LUCID experiment and the associated GEANT4 simulations.
|

|-
|[https://www.jlab.org/physics/hall-b/clas12 clas12]
|
|

|-
|[http://wiki.grid.auth.gr/wiki/bin/view/ComplexityScienceSSC/VO vo.complex-systems.eu]
|The goal of vo.complex-systems.eu is to promote the study of complex systems and complex networks on the Grid infrastructure. The VO also serves as the building layer of collaboration among international scientists focusing on the research area of Complexity Science.
|{{@|romain.reuillon|iscpif.fr}}
 
|-
|[http://comet.kek.jp comet.j-parc.jp]
|Muon-to-electron conversion experiment at J-PARC, used by international COMET collaborators for design studies and data analysis. COMET will test Beyond-the-Standard-Model physics in a way that is complementary to the experiments at the LHC.
|{{@|daniela.bauer|imperial.ac.uk}}<br>{{@|Yoshi.Uchida|imperial.ac.uk}}<br>{{@|simon.fayer05|imperial.ac.uk}}

|-
|[https://confluence.egi.eu/display/EGIPP/DTEAM+VO dteam]
|The goal of the VO is to facilitate the deployment of a stable production Grid infrastructure. To this end, members of this VO (who have to be associated with a registered site and be involved in its operation) are allowed to run tests to validate the correct configuration of their site. Site performance evaluation and/or monitoring programs may also be run under the DTEAM VO with the approval of the Site Manager, subject to the agreement of the affected sites' management.
|{{@|kkoum|admin.grnet.gr}}<br>{{@|alessandro.paolini|egi.eu}}<br>{{@|matthew.viljoen|egi.eu}}<br>{{@|kyrginis|admin.grnet.gr}}

|-
|[http://www.wenmr.eu enmr.eu]
|Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with user-friendly tools, platforms for data analysis and exchange, and an underlying e-infrastructure. WeNMR groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-infrastructure and its steadily growing virtual organization, currently the second largest VO in the area of life sciences. WeNMR provides an e-infrastructure platform and science gateway for structural biology towards EGI for the users of existing infrastructures. It involves researchers from around the world and builds bridges to other areas of structural biology. Integration with SAXS, a rapidly growing and highly complementary method, is directly included in WeNMR, and links are also being established to related initiatives. WeNMR will serve all relevant INSTRUCT communities in line with the ESFRI roadmap.
|{{@|Marco.Verlato|pd.infn.it}}<br>{{@|a.m.j.j.bonvin|uu.nl}}<br>{{@|rosato|cerm.unifi.it}}<br>{{@|giachetti|cerm.unifi.it}}<br>{{@|verlato|infn.it}}

|-
|[http://www.sruc.ac.uk/epic/ epic.vo.gridpp.ac.uk]
|EPIC replaces an earlier EPIC project that was focused upon Veterinary Surveillance (Phase 1). The new consortium EPIC project aims to become a world leader in policy-linked research and includes some of Scotland's leading veterinary epidemiologists and scientists. The overarching purpose of the Centre is to provide access to high-quality advice and analyses on the epidemiology and control of animal diseases that are important to Scotland, and to best prepare Scotland for the next major disease incursion. Ultimately, this strategic advice to the Scottish Government will help ensure that the interests of the various stakeholders involved in disease emergency planning and response are met as effectively as possible, all within the context of a rapidly changing environment: issues such as climate change are now influencing the livestock disease risks that Scotland faces.
|{{@|thomas.doherty|glasgow.ac.uk}}

|-
|[http://www.euearthsciencegrid.org/content/esr-vo-introduction esr]
|Earth Science Research covers research in the fields of Solid Earth, Ocean, Atmosphere and their interfaces. A large variety of communities correspond to each domain, some of them covering several domains. In the ESR Virtual Organization (ESR-VO) four domains are represented: Earth Observation, Climate, Hydrology and Solid Earth Physics.
|{{@|andre.gemuend|scai.fraunhofer.de}}<br>{{@|weissenb|ccr.jussieu.fr}}
 
|-
|[http://www.fnal.gov fermilab]
|The Fermilab VO is an "umbrella" VO that includes the Fermilab Campus Grid (FermiGrid) and Fermilab Grid Testing (ITB) infrastructures, and all Fermilab computing activities that are not big enough to have their own Virtual Organization. Broadly these include the intensity frontier program, theoretical simulations, fixed-target analysis, and accelerator and beamline design, as well as activities performed by the Fermilab Campus Grid administrators.
|{{@|garzoglio|fnal.gov}}<br>{{@|boyd|fnal.gov}}

|-
|[http://geant4.web.cern.ch/geant4/ geant4]
|Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278.
|{{@|Andrea.Sciaba|cern.ch}}<br>{{@|Andrea.Dotti|cern.ch}}

|-
|[http://www.gridpp.ac.uk gridpp]
|GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN. They are building a distributed computing Grid across the UK for particle physicists. At the moment there is a working particle physics Grid across 17 UK institutions.
|{{@|m.doidge|lancaster.ac.uk}}

|-
|[http://www.hyperk.org hyperk.org]
|The Hyper-Kamiokande (Hyper-K) detector is proposed as a next-generation underground water Cherenkov detector. It will serve as the far detector of a long-baseline neutrino oscillation experiment envisioned for the upgraded J-PARC, and as a detector capable of observing proton decays, atmospheric neutrinos, and neutrinos of astronomical origin far beyond the sensitivity of the Super-Kamiokande (Super-K) detector. The baseline design of Hyper-K is based on the highly successful Super-K, taking full advantage of a well-proven technology.
|{{@|C.J.Walker|qmul.ac.uk}}<br>{{@|francesca.di_lodovico|kcl.ac.uk}}

|-
|[http://www.icecube.wisc.edu/ icecube]
|The goal of the VO is to enable the usage of Grid resources for ICECUBE collaboration members, mainly for simulation and reconstruction.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}<br>{{@|andreas.haupt|desy.de}}

|-
|[http://www-flc.desy.de/ ilc]
|VO for the International Linear Collider community.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}<br>{{@|Christoph.Wissing|desy.de}}

|-
|[https://voms2.cnaf.infn.it:8443/voms/ipv6.hepix.org/admin/home.action ipv6.hepix.org]
|The goal of the VO is to carry out testing of IPv6 readiness, functionality and performance of the middleware, applications and tools required by the stakeholder communities, especially HEP. Other authorised activities include use of the testbed by related IPv6 activities inside EGI, the related middleware technology providers and other infrastructures used by WLCG/HEP.
|{{@|david.kelsey|stfc.ac.uk}}

|-
|[http://lz.lbl.gov/ lz]
|This VO supports the LUX-ZEPLIN (LZ) experiment, designed to search for dark matter.
|{{@|E.Korolkova|sheffield.ac.uk}}<br>{{@|j.dobson|ucl.ac.uk}}
 
|-
|[http://magic.mppmu.mpg.de magic]
|MAGIC is a system of two imaging atmospheric Cherenkov telescopes (IACTs). MAGIC-I started routine operation after commissioning in 2004. Construction of MAGIC-II was completed in early 2009, and the two telescopes have been in operation ever since, with a break in 2012 for an upgrade that achieved full homogeneity. The project is funded primarily by the funding agencies BMBF (Germany), MPG (Germany), INFN (Italy), MICINN (Spain) and ETH Zurich (Switzerland).
|{{@|neissner|pic.es}}<br>{{@|contrera|gae.ucm.es}}<br>{{@|rfirpo|pic.es}}

|-
|[http://www.magrid.ma vo.magrid.ma]
|vo.magrid.ma is a multidisciplinary VO providing general grid services and support to the Moroccan scientific community.
|{{@|rahim|cnrst.ma}}

|-
|[http://www.mice.iit.edu/ mice]
|A VO to support the activities of the Muon Ionisation Cooling Experiment (MICE); specifically, to enable the moving of MICE data around the Grid, followed by the submission of analysis jobs against these data. This is expected to be a small VO.
|{{@|d.colling|imperial.ac.uk}}<br>{{@|p.hodgson|sheffield.ac.uk}}<br>{{@|daniela.bauer|imperial.ac.uk}}<br>{{@|janusz.martyniak|imperial.ac.uk}}

|-
|[https://microboone.fnal.gov/ uboone]
|MicroBooNE is a large 170-ton liquid-argon time projection chamber (LArTPC) neutrino experiment located on the Booster neutrino beamline at Fermilab.
|

|-
|[https://www.psi.ch/en/mu3e mu3e]
|The Mu3e experiment is a new search for the lepton-flavour-violating decay of a positive muon into two positrons and one electron.
|

|-
|[https://na62.gla.ac.uk/ na62.vo.gridpp.ac.uk]
|The NA62 VO (na62.vo.gridpp.ac.uk) provides grid computing and data storage resources to the NA62 collaboration. The VO is supported by the Universities of Cambridge, Glasgow, Birmingham, Lancaster, Liverpool, Manchester and Oxford, Imperial College London and RAL (from the UK), together with CERN, CNAF (Italy) and UCL (Belgium). More information about the NA62 experiment can be found at http://na62.web.cern.ch/na62/. The production portal is located at http://na62.gla.ac.uk/.
|{{@|Dan.Protopopescu|glasgow.ac.uk}}<br>{{@|David.Britton|glasgow.ac.uk}}
 
|-
|[https://wiki.egi.eu/wiki/OPS_vo ops]
|The goal of the VO is to facilitate the operations of the LCG/EGI infrastructure, which includes running official monitoring, re-certification and performance evaluation tools. Additionally, the VO is used for interoperations with other grid infrastructures.
|{{@|eimamagi|srce.hr}}<br>{{@|alessandro.paolini|egi.eu}}

|-
|[http://www.phenogrid.dur.ac.uk/ pheno]
|Phenogrid is the VO for UK particle physics theorists who do not fit within one of the LHC experiments (e.g. developers of Monte Carlo generators).
|{{@|jeppe.andersen|durham.ac.uk}}<br>{{@|adam.j.boutcher|durham.ac.uk}}<br>{{@|paul.clark|durham.ac.uk}}

|-
|[https://snoplus.phy.queensu.ca/ snoplus.snolab.ca]
|VO for the SNO+ experiment, a multi-purpose liquid scintillator neutrino experiment based in Sudbury, Canada. Members of the VO contribute to the European computing effort to accurately simulate the SNO+ detector response.
|{{@|Jeanne.wilson|kcl.ac.uk}}<br>{{@|C.J.Walker|qmul.ac.uk}}<br>{{@|m.mottram|qmul.ac.uk}}
 
|-
|[http://www.imperial.ac.uk/high-energy-physics/research/experiments/solid/ solidexperiment.org]
|Supports grid users of the SoLid experiment.
|{{@|daniela.bauer|imperial.ac.uk}}<br>{{@|antonin.vacheret|imperial.ac.uk}}

|-
|[http://www.t2k.org t2k.org]
|T2K is a neutrino experiment designed to investigate how neutrinos change from one flavour to another as they travel (neutrino oscillations). An intense beam of muon neutrinos is generated at the J-PARC nuclear physics site on the east coast of Japan and directed across the country to the Super-Kamiokande neutrino detector in the mountains of western Japan. The beam is measured once before it leaves the J-PARC site, using the near detector ND280, and again at Super-K: the change in the measured intensity and composition of the beam is used to provide information on the properties of neutrinos.
|{{@|sophie.king|kcl.ac.uk}}<br>{{@|tomislav.vladisavljevic|stfc.ac.uk}}

|-
|[http://wwwcascina.virgo.infn.it/ virgo]
|Scientific target: detection of gravitational waves. Gravitational waves are predicted by the General Theory of Relativity but have still not been directly detected, owing to their extremely weak interaction with matter. Large interferometric detectors, like Virgo, are operating with the aim of directly detecting gravitational signals from various astrophysical sources. Signals are expected to be deeply buried in detector noise, and suitable data analysis algorithms are developed to allow detection and signal parameter estimation. Many kinds of searches need large computing resources, and in some important cases they are computationally bound: the larger the available computing power, the wider the portion of source parameter space that can be explored. VO target: to allow data management and computationally intensive data analysis.
|{{@|cristiano.palomba|roma1.infn.it}}<br>{{@|alberto.colla|roma1.infn.it}}
 
|-
|[http://mossaic.org/ vo.landslides.mossaic.org]
|A virtual organisation for landslide modellers associated with the Management of Slope Stability in Communities (MoSSaiC) project. The VO is used for running landslide modelling software such as CHASM and QUESTA.
|{{@|l.kreczko|bristol.ac.uk}}

|-
|[http://moedal.org vo.moedal.org]
|The MoEDAL VO allows members of the MoEDAL Collaboration to perform all of the computing activities relevant for the MoEDAL experiment, making use of available resources according to the policy defined by the Collaboration.
|{{@|t.whyntie|qmul.ac.uk}}<br>{{@|daniel.felea|cern.ch}}

|-
|[https://voms.gridpp.ac.uk:8443/voms/vo.northgrid.ac.uk vo.northgrid.ac.uk]
|Regional Virtual Organisation created to allow access to HEP resources for other local disciplines at the NorthGrid sites: Manchester, Lancaster, Liverpool and Sheffield. Users from these universities can apply.
|{{@|alessandra.forti|cern.ch}}<br>{{@|robert.frank|manchester.ac.uk}}

|-
|[http://www.scotgrid.ac.uk/ vo.scotgrid.ac.uk]
|The VO is for academic and other users in Scotland to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access. It is also designed as a test VO to allow maintenance and operational testing of site services.
|{{@|garth.roy|glasgow.ac.uk}}

|-
|[http://www.southgrid.ac.uk/VO/ vo.southgrid.ac.uk]
|The VO is for academic and other users in the SouthGrid region (UKI-SOUTHGRID-BHAM-HEP, UKI-SOUTHGRID-BRIS-HEP, UKI-SOUTHGRID-CAM-HEP, UKI-SOUTHGRID-OX-HEP, UKI-SOUTHGRID-RALPP and UKI-SOUTHGRID-SUSX) to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access.
|{{@|pete.gronbech|physics.ox.ac.uk}}

|-
|[http://www-zeus.desy.de/ zeus]
|ZEUS is a collaboration of about 450 physicists who are running a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. The ZEUS detector is a sophisticated tool for studying the particle reactions provided by the high-energy beams of the HERA accelerator. The participating scientists are pushing forward our knowledge of the fundamental particles and forces of nature, gaining unsurpassed insight into the laws of the microcosm.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}
<!-- end main approved list -->
|}
  
== IRIS Partners ==

{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+Other VOs -->
|-style="background:#7C8AAF;color:white"
!Name
!Area
!Contact
<!-- Not recorded as active by QMUL: CCFE, CCP4, CASU, LIGO, Central Laser Facility, Gaia, ISIS, ALMA, MERLIN, EUCLID, diamond, LZ UK, WFAU -->
<!-- start iris approved list -->

|-
|[https://atlas.cern/ atlas]
|The ATLAS VO allows the members of the ATLAS Collaboration to perform all the computing activities relevant for the ATLAS experiment, making use of the available resources following the policy defined by the Collaboration.
|{{@|Alessandro.DeSalvo|roma1.infn.it}}<br>{{@|Elisabetta.Vilucchi|lnf.infn.it}}<br>{{@|jd|bnl.gov}}<br>{{@|james.william.walder|cern.ch}}
  
 
|-
|[http://cms.cern.ch/iCMS/ cms]
|The Compact Muon Solenoid (CMS) experiment is a large general-purpose particle physics detector built at the proton-proton Large Hadron Collider (LHC) at CERN in Switzerland.
|{{@|Andreas.Pfeiffer|cern.ch}}<br>{{@|stefano.belforte|cern.ch}}<br>{{@|stefano.belforte|ts.infn.it}}<br>{{@|Daniele.Bonacorsi|bo.infn.it}}<br>{{@|Christoph.Wissing|desy.de}}<br>{{@|sexton|gmail.com}}<br>{{@|lammel|fnal.gov}}<br>{{@|jose.hernandez|ciemat.es}}<br>{{@|gutsche|fnal.gov}}<br>{{@|Andrea.Sciaba|cern.ch}}

|-
|[https://portal.cta-observatory.org/Pages/Home.aspx vo.cta.in2p3.fr]
|Monte Carlo simulation production and analysis for the CTA (Cherenkov Telescope Array) international consortium.
|{{@|cecile.barbier|lapp.in2p3.fr}}<br>{{@|arrabito|in2p3.fr}}
|-
|[http://www.dunescience.org dune]
|DUNE is the Deep Underground Neutrino Experiment, managed by the global DUNE collaboration and hosted at Fermilab. The collaboration is building a deep-underground liquid-argon-based neutrino detector to study accelerator-based neutrino oscillations, supernova neutrinos, and nucleon decay.
|{{@|andrew.mcnab|cern.ch}}<br>{{@|timm|fnal.gov}}

|-
|[http://lhcb.web.cern.ch/lhcb/ lhcb]
|The LHCb (Large Hadron Collider beauty) experiment is mainly aimed at solving the mystery of the matter-antimatter imbalance in the Universe.
|{{@|andrew.mcnab|cern.ch}}<br>{{@|concezio.bozzi|cern.ch}}<br>{{@|christophe.denis.haen|cern.ch}}<br>{{@|jan.van.eldik|cern.ch}}<br>{{@|joel.closier|cern.ch}}<br>{{@|ben.couturier|cern.ch}}

|-
|[http://www.lsst.org/lsst/ lsst]
|The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey telescope with a 3200-megapixel camera, built to image faint astronomical objects, rapidly scan the sky and observe probes of dark matter and dark energy. LSST Data Management and simulation jobs will run on OSG and EGI.
|{{@|boutigny|in2p3.fr}}<br>{{@|IGoodenow|lsst.org}}<br>{{@|fabio|in2p3.fr}}<br>{{@|yangw|SLAC.stanford.edu}}<br>{{@|kherner|fnal.gov}}

|-
|[https://www.skatelescope.org/the-ska-project/ skatelescope.eu]
|The Square Kilometre Array (SKA) project is an international effort to build the world's largest radio telescope, with eventually over a square kilometre (one million square metres) of collecting area. The scale of the SKA represents a huge leap forward in both engineering and research & development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavours in history, the SKA will bring together a wealth of the world's finest scientists, engineers and policy makers to bring the project to fruition. The VO skatelescope.eu supports this project.
|{{@|alessandra.forti|cern.ch}}<br>{{@|andrew.mcnab|cern.ch}}<br>{{@|rohini.joshi|manchester.ac.uk}}

|-
|[https://eucliduk.net/ eucliduk.net]
|The Euclid mission aims to understand why the expansion of the Universe is accelerating and the nature of the source responsible for this acceleration, which physicists refer to as dark energy.
|{{@|msh|roe.ac.uk}}
<!-- end iris approved list -->
|}

== Other VOs ==

This area can be used to record information about VOs that are site specific or localised in a region, and to advertise a local VO that you would like supported elsewhere.
  
As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs (your site cannot yet support these, but when a VO is set up and functioning we will let you know).
  
{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+VOs being established -->

|-style="background:#7C8AAF;color:white"
!Name
!Area
!Contact

<!-- start new approved list -->
<!-- end new approved list -->
|}
 
== VOs that have been removed from approved list ==

The table below comprises a history of VOs that have been removed from the approved list for various reasons.

{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+VOs that have been removed -->

|-style="background:#7C8AAF;color:white"
!Name
!Date
!Reason
|-
|babar
|9 Oct 2013
|none

|-
|camont
|7 June 2017
|none

|-
|camont.gridpp.ac.uk
|9 Oct 2013
|The simply named "camont" is still approved. This was a duplicate/error.

|-
|cdf
|7 June 2017
|none

|-
|dzero
|7 June 2017
|none

|-
|[https://voms.egi.cesga.es:8443/voms/fusion/register/start.action fusion]
|30 Jan 2017
|Discussion with Rubén Vallés Pérez. VO appears defunct.

|-
|hone
|24 Nov 2015
|Discussed at Ops Meeting. Defunct.

|-
|neiss
|7 June 2017
|none

|-
|superbvo.org
|19 Jan 2016
|Discussed at Ops Meeting. Defunct.

|-
|supernemo.vo.eu-egee.org
|24 Feb 2020
|Now called supernemo.org.

|-
|vo.londongrid.ac.uk
|in progress [https://ggus.eu/?mode=ticket_info&ticket_id=129065 GGUS]
|VO not used any more

|-
|vo.sixt.cern.ch
|11 Nov 2015
|No members, no voms servers, defunct

|}
The examples of site-info.def entries for yaim have been moved: [[ExampleSiteinfoDefEntries|Example site-info.def entries]]
  
<div style="margin:auto; border:2px solid black;background-color:#EEEEEE;width:600px; max-width:97%">
<div style="font-size:1.2em; font-weight:bold; padding-left:4px;background-color:#7C8AAF;color:#fff;">Please Note</div>
<div style="padding:3px 6px">
Please do not change the '''vomsdir/''' or '''vomses/''' entries below, as they are automatically updated from the EGI Operations Portal.
Any changes you make will be lost!
</div>
</div>

This section presents the VO records for each approved VO, extracted from the Operations Portal.

<!-- START OF SIDSECTION -->{{BOX VO|ALICE|<!-- VOMS RECORDS for ALICE -->
''' Filename: ''' /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/alice/voms-alice-auth.app.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/alice-lcg-voms2.cern.ch
<pre><nowiki>
"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"
</nowiki></pre>

''' Filename: ''' /etc/vomses/alice-voms2.cern.ch
<pre><nowiki>
"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"
</nowiki></pre>

''' Filename: ''' /etc/vomses/alice-voms-alice-auth.app.cern.ch
<pre><nowiki>
"alice" "voms-alice-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch" "alice"
</nowiki></pre>

Notes:
n/a
}}
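Sites normally install these files from the Yum repository above or via configuration management rather than by hand, but for a quick manual test the two file formats can be written directly. The sketch below uses a scratch prefix instead of / so it can run without root (the prefix is illustrative; on a real host the files live under /etc), and copies the ALICE lcg-voms2.cern.ch record from the box above:

```shell
# Sketch: install one VOMS record by hand under a scratch prefix.
# On a real worker node the prefix would be / (i.e. /etc/...).
PREFIX=./scratch
mkdir -p "$PREFIX/etc/grid-security/vomsdir/alice" "$PREFIX/etc/vomses"

# .lsc file: line 1 is the VOMS host DN, line 2 the issuing CA DN.
cat > "$PREFIX/etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc" <<'EOF'
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
EOF

# vomses record: "vo" "host" "port" "host DN" "vo alias", all on one line.
cat > "$PREFIX/etc/vomses/alice-lcg-voms2.cern.ch" <<'EOF'
"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"
EOF
```

The same two-file pattern repeats for every record listed below; only the VO directory name, host, port and DNs change.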
  
{{BOX VO|ATLAS|<!-- VOMS RECORDS for ATLAS -->
''' Filename: ''' /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/atlas-lcg-voms2.cern.ch
<pre><nowiki>
"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"
</nowiki></pre>

''' Filename: ''' /etc/vomses/atlas-voms2.cern.ch
<pre><nowiki>
"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"
</nowiki></pre>

''' Filename: ''' /etc/vomses/atlas-voms-atlas-auth.app.cern.ch
<pre><nowiki>
"atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|BES|<!-- VOMS RECORDS for BES -->
''' Filename: ''' /etc/grid-security/vomsdir/bes/voms.ihep.ac.cn.lsc
<pre><nowiki>
/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/bes-voms.ihep.ac.cn
<pre><nowiki>
"bes" "voms.ihep.ac.cn" "15001" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "bes"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|BIOMED|<!-- VOMS RECORDS for BIOMED -->
''' Filename: ''' /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/biomed-cclcgvomsli01.in2p3.fr
<pre><nowiki>
"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "biomed"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|CALICE|<!-- VOMS RECORDS for CALICE -->
''' Filename: ''' /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/calice-grid-voms.desy.de
<pre><nowiki>
"calice" "grid-voms.desy.de" "15102" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "calice"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|CEPC|<!-- VOMS RECORDS for CEPC -->
''' Filename: ''' /etc/grid-security/vomsdir/cepc/voms.ihep.ac.cn.lsc
<pre><nowiki>
/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/cepc-voms.ihep.ac.cn
<pre><nowiki>
"cepc" "voms.ihep.ac.cn" "15005" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "cepc"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|CERNATSCHOOL.ORG|<!-- VOMS RECORDS for CERNATSCHOOL.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|CLAS12|<!-- VOMS RECORDS for CLAS12 -->
Notes:
n/a
}}


{{BOX VO|CMS|<!-- VOMS RECORDS for CMS -->
''' Filename: ''' /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/cms-lcg-voms2.cern.ch
<pre><nowiki>
"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cms-voms2.cern.ch
<pre><nowiki>
"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cms-voms-cms-auth.app.cern.ch
<pre><nowiki>
"cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|COMET.J-PARC.JP|<!-- VOMS RECORDS for COMET.J-PARC.JP -->
''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms02.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms02.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms03.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms03.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

Notes:
n/a
}}
  
 
{{BOX VO|DTEAM|<!-- VOMS RECORDS for DTEAM -->
''' Filename: ''' /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc
<pre><nowiki>
/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016
</nowiki></pre>

''' Filename: ''' /etc/vomses/dteam-voms2.hellasgrid.gr
<pre><nowiki>
"dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|DUNE|<!-- VOMS RECORDS for DUNE -->
''' Filename: ''' /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/dune-voms1.fnal.gov
<pre><nowiki>
"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "dune"
</nowiki></pre>

''' Filename: ''' /etc/vomses/dune-voms2.fnal.gov
<pre><nowiki>
"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "dune"
</nowiki></pre>

Notes:
n/a
}}
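A frequent misconfiguration is a vomses record whose DN has drifted out of step with the matching vomsdir .lsc file (for example after a VOMS host certificate reissue). As a rough sketch, the two can be cross-checked with standard tools; the scratch file names here are invented for the example, while the record contents are the DUNE voms1.fnal.gov entries shown above:

```shell
# Sketch: check that the host DN in a vomses record (4th quoted field)
# matches the first line of the corresponding .lsc file.
LSC=scratch/dune-voms1.fnal.gov.lsc
VOMSES=scratch/dune-voms1.fnal.gov.vomses
mkdir -p scratch

# Scratch copies of the DUNE records above.
printf '%s\n' \
  '/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov' \
  '/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3' > "$LSC"
printf '%s\n' \
  '"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "dune"' > "$VOMSES"

# Splitting on double quotes, the 8th awk field is the DN.
dn_vomses=$(awk -F'"' '{print $8}' "$VOMSES")
dn_lsc=$(head -n 1 "$LSC")

if [ "$dn_vomses" = "$dn_lsc" ]; then
  echo "DN match"
else
  echo "DN MISMATCH" >&2
fi
```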
  
 
{{BOX VO|ENMR.EU|<!-- VOMS RECORDS for ENMR.EU -->
''' Filename: ''' /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/enmr.eu-voms2.cnaf.infn.it
<pre><nowiki>
"enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "enmr.eu"
</nowiki></pre>

Notes:
n/a
}}
  
  
 
{{BOX VO|EPIC.VO.GRIDPP.AC.UK|<!-- VOMS RECORDS for EPIC.VO.GRIDPP.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"
</nowiki></pre>

Notes:
n/a
}}
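Since the vomses format is rigid (exactly five double-quoted fields per line), a simple well-formedness check over a vomses directory can catch truncated or hand-mangled records before they cause authentication failures. This is only a sketch, not a supported tool; it operates on a scratch copy of the EPIC record above rather than on /etc/vomses:

```shell
# Sketch: require exactly five double-quoted fields (10 quotes) per
# non-empty line in every file of a vomses directory.
mkdir -p scratch/vomses
printf '%s\n' \
  '"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"' \
  > scratch/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk

bad=0
for f in scratch/vomses/*; do
  # With -F'"', a line holding 10 quotes splits into 11 awk fields.
  awk -F'"' 'NF && NF != 11 { exit 1 }' "$f" || { echo "malformed: $f"; bad=1; }
done
[ "$bad" -eq 0 ] && echo "all vomses records well-formed"
```

Pointing the loop at /etc/vomses instead of the scratch directory gives a quick sanity check after the automated updates mentioned in the note above.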
  
{{BOX VO|ESR|<!-- VOMS RECORDS for ESR -->
''' Filename: ''' /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc
<pre><nowiki>
/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl
/C=NL/O=NIKHEF/CN=NIKHEF medium-security certification auth
</nowiki></pre>

''' Filename: ''' /etc/vomses/esr-voms.grid.sara.nl
<pre><nowiki>
"esr" "voms.grid.sara.nl" "30001" "/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl" "esr"
</nowiki></pre>

Notes:
n/a
}}
  
{{BOX VO|EUCLIDUK.NET|<!-- VOMS RECORDS for EUCLIDUK.NET -->
''' Filename: ''' /etc/grid-security/vomsdir/eucliduk.net/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/eucliduk.net-voms.gridpp.ac.uk
<pre><nowiki>
"eucliduk.net" "voms.gridpp.ac.uk" "15518" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "eucliduk.net"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|FERMILAB|<!-- VOMS RECORDS for FERMILAB -->
''' Filename: ''' /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/fermilab-voms1.fnal.gov
<pre><nowiki>
"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "fermilab"
</nowiki></pre>

''' Filename: ''' /etc/vomses/fermilab-voms2.fnal.gov
<pre><nowiki>
"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "fermilab"
</nowiki></pre>

Notes:
n/a
}}
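Each record above is deployed verbatim as its own file. A minimal sketch of installing the fermilab voms1 .lsc record follows; a real deployment writes under /etc/grid-security/vomsdir/fermilab/, but the sketch uses a scratch directory so it runs unprivileged.

```python
import tempfile
from pathlib import Path

# Illustrative sketch: install the fermilab voms1 .lsc record shown above.
# A real deployment targets /etc/grid-security/vomsdir/fermilab/;
# a scratch directory is used here so the sketch runs without root.
vomsdir = Path(tempfile.mkdtemp()) / "fermilab"
vomsdir.mkdir(parents=True)
lsc = vomsdir / "voms1.fnal.gov.lsc"
lsc.write_text(
    "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov\n"
    "/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3\n"
)
# A well-formed .lsc file is exactly two lines: host DN, then issuing CA DN.
print(len(lsc.read_text().splitlines()))  # 2
```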
  
  
{{BOX VO|GEANT4|<!-- VOMS RECORDS for GEANT4 -->
''' Filename: ''' /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/geant4-lcg-voms2.cern.ch
<pre><nowiki>
"geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"
</nowiki></pre>

''' Filename: ''' /etc/vomses/geant4-voms2.cern.ch
<pre><nowiki>
"geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|GRIDPP|<!-- VOMS RECORDS for GRIDPP -->
''' Filename: ''' /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/gridpp-voms.gridpp.ac.uk
<pre><nowiki>
"gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/gridpp-voms02.gridpp.ac.uk
<pre><nowiki>
"gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/gridpp-voms03.gridpp.ac.uk
<pre><nowiki>
"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"
</nowiki></pre>

Notes:
n/a
}}
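A consistency check worth scripting (illustrative only, not official GridPP tooling): the server DN quoted in a vomses record should match the first line, the host DN, of the corresponding .lsc file. Using the gridpp records above, inlined as strings:

```python
import shlex

# The gridpp voms.gridpp.ac.uk records from above, inlined for illustration.
lsc_text = (
    "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk\n"
    "/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B\n"
)
vomses_line = ('"gridpp" "voms.gridpp.ac.uk" "15000" '
               '"/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"')

host_dn = lsc_text.splitlines()[0]          # first .lsc line: host DN
dn_in_vomses = shlex.split(vomses_line)[3]  # fourth vomses field: server DN
print(host_dn == dn_in_vomses)  # True
```

A mismatch between these two values is a common cause of proxy-validation failures at sites.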
  
  
{{BOX VO|HYPERK.ORG|<!-- VOMS RECORDS for HYPERK.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/hyperk.org-voms.gridpp.ac.uk
<pre><nowiki>
"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/hyperk.org-voms02.gridpp.ac.uk
<pre><nowiki>
"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/hyperk.org-voms03.gridpp.ac.uk
<pre><nowiki>
"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|ICECUBE|<!-- VOMS RECORDS for ICECUBE -->
''' Filename: ''' /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/icecube-grid-voms.desy.de
<pre><nowiki>
"icecube" "grid-voms.desy.de" "15106" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "icecube"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|ILC|<!-- VOMS RECORDS for ILC -->
''' Filename: ''' /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/ilc-grid-voms.desy.de
<pre><nowiki>
"ilc" "grid-voms.desy.de" "15110" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "ilc"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|IPV6.HEPIX.ORG|<!-- VOMS RECORDS for IPV6.HEPIX.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it
<pre><nowiki>
"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|LHCB|<!-- VOMS RECORDS for LHCB -->
''' Filename: ''' /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/lhcb-lcg-voms2.cern.ch
<pre><nowiki>
"lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"
</nowiki></pre>

''' Filename: ''' /etc/vomses/lhcb-voms2.cern.ch
<pre><nowiki>
"lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"
</nowiki></pre>

Notes:
n/a
}}
  
{{BOX VO|LSST|<!-- VOMS RECORDS for LSST -->
''' Filename: ''' /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/lsst-voms.slac.stanford.edu
<pre><nowiki>
"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu" "lsst"
</nowiki></pre>

Notes:
n/a
}}
  
{{BOX VO|LZ|<!-- VOMS RECORDS for LZ -->
''' Filename: ''' /etc/grid-security/vomsdir/lz/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/lz/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/lz-voms.gridpp.ac.uk
<pre><nowiki>
"lz" "voms.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "lz"
</nowiki></pre>

''' Filename: ''' /etc/vomses/lz-voms02.gridpp.ac.uk
<pre><nowiki>
"lz" "voms02.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "lz"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|MAGIC|<!-- VOMS RECORDS for MAGIC -->
Notes:
n/a
}}
  
  
 
{{BOX VO|MICE|<!-- VOMS RECORDS for MICE -->
''' Filename: ''' /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/mice-voms.gridpp.ac.uk
<pre><nowiki>
"mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"
</nowiki></pre>

''' Filename: ''' /etc/vomses/mice-voms02.gridpp.ac.uk
<pre><nowiki>
"mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"
</nowiki></pre>

''' Filename: ''' /etc/vomses/mice-voms03.gridpp.ac.uk
<pre><nowiki>
"mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|MU3E|<!-- VOMS RECORDS for MU3E -->
''' Filename: ''' /etc/grid-security/vomsdir/mu3e/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/mu3e-voms.gridpp.ac.uk
<pre><nowiki>
"mu3e" "voms.gridpp.ac.uk" "15516" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mu3e"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|NA62.VO.GRIDPP.AC.UK|<!-- VOMS RECORDS for NA62.VO.GRIDPP.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

Notes:
n/a
}}

{{BOX VO|OPS|<!-- VOMS RECORDS for OPS -->
''' Filename: ''' /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/ops-lcg-voms2.cern.ch
<pre><nowiki>
"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"
</nowiki></pre>

''' Filename: ''' /etc/vomses/ops-voms2.cern.ch
<pre><nowiki>
"ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"
</nowiki></pre>

Notes:
n/a
}}
 +
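Each /etc/vomses record above is one line of five double-quoted fields: VO alias, VOMS server hostname, port, server certificate DN, and VO name. A minimal sketch (not part of any official GridPP or VOMS tooling; the helper function is invented for illustration) that splits such a record into its fields:

```python
import shlex

def parse_vomses_line(line):
    """Split a vomses record into its five quoted fields:
    (alias, host, port, server_dn, vo_name)."""
    fields = shlex.split(line)  # shlex honours the double quotes
    if len(fields) != 5:
        raise ValueError("expected 5 quoted fields, got %d" % len(fields))
    alias, host, port, server_dn, vo = fields
    return alias, host, int(port), server_dn, vo

# Example record for the ops VO, copied from the table above.
record = '"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"'
print(parse_vomses_line(record))
```

Quoting matters here because server DNs may contain spaces; a naive `line.split()` would break such records apart.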
  
  
 
{{BOX VO|PHENO|<!-- VOMS RECORDS for PHENO -->
''' Filename: ''' /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/pheno-voms.gridpp.ac.uk
<pre><nowiki>
"pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"
</nowiki></pre>

''' Filename: ''' /etc/vomses/pheno-voms02.gridpp.ac.uk
<pre><nowiki>
"pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"
</nowiki></pre>

''' Filename: ''' /etc/vomses/pheno-voms03.gridpp.ac.uk
<pre><nowiki>
"pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"
</nowiki></pre>

Notes:

n/a

}}
  
  
{{BOX VO|SKATELESCOPE.EU|<!-- VOMS RECORDS for SKATELESCOPE.EU -->
''' Filename: ''' /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk
<pre><nowiki>
"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"
</nowiki></pre>

''' Filename: ''' /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk
<pre><nowiki>
"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"
</nowiki></pre>

''' Filename: ''' /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk
<pre><nowiki>
"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"
</nowiki></pre>

Notes:

n/a

}}
  
  
 
{{BOX VO|SNOPLUS.SNOLAB.CA|<!-- VOMS RECORDS for SNOPLUS.SNOLAB.CA -->
''' Filename: ''' /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk
<pre><nowiki>
"snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"
</nowiki></pre>

''' Filename: ''' /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk
<pre><nowiki>
"snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"
</nowiki></pre>

''' Filename: ''' /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk
<pre><nowiki>
"snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"
</nowiki></pre>

Notes:

n/a

}}
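The GridPP-hosted VOs above share a common pattern: the same VO is served by voms.gridpp.ac.uk, voms02.gridpp.ac.uk and voms03.gridpp.ac.uk on a single VO-specific port. A hypothetical helper (the function itself is invented; the hostnames and DNs are copied from the records above) that generates the three vomses lines for such a VO:

```python
# Server hostnames and certificate DNs as listed in the records above.
SERVERS = [
    ("voms.gridpp.ac.uk",   "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk"),
    ("voms02.gridpp.ac.uk", "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk"),
    ("voms03.gridpp.ac.uk", "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk"),
]

def gridpp_vomses_records(vo, port):
    """Return the three vomses lines for a VO hosted on the GridPP servers."""
    return ['"%s" "%s" "%d" "%s" "%s"' % (vo, host, port, dn, vo)
            for host, dn in SERVERS]

for line in gridpp_vomses_records("snoplus.snolab.ca", 15503):
    print(line)
```

Only the port and VO name vary between GridPP-hosted VOs, which is why the records in these boxes differ in just those two fields.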
  
  
{{BOX VO|SOLIDEXPERIMENT.ORG|<!-- VOMS RECORDS for SOLIDEXPERIMENT.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

Notes:

n/a

}}
  
  
 
{{BOX VO|T2K.ORG|<!-- VOMS RECORDS for T2K.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/t2k.org-voms.gridpp.ac.uk
<pre><nowiki>
"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/t2k.org-voms02.gridpp.ac.uk
<pre><nowiki>
"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/t2k.org-voms03.gridpp.ac.uk
<pre><nowiki>
"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"
</nowiki></pre>

Notes:

n/a

}}
  
  
{{BOX VO|UBOONE|<!-- VOMS RECORDS for UBOONE -->
Notes:

n/a

}}


{{BOX VO|VIRGO|<!-- VOMS RECORDS for VIRGO -->
''' Filename: ''' /etc/grid-security/vomsdir/virgo/voms.cnaf.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/virgo-voms.cnaf.infn.it
<pre><nowiki>
"virgo" "voms.cnaf.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it" "virgo"
</nowiki></pre>

Notes:

n/a

}}
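Each .lsc file above holds exactly two lines: the VOMS server's host certificate DN, then the DN of the CA that issued it. A minimal sketch (the helper function and the temporary directory are illustrative, not GridPP tooling) that writes such a file under the conventional /etc/grid-security/vomsdir/&lt;vo&gt;/&lt;host&gt;.lsc layout:

```python
import os
import tempfile

def write_lsc(vomsdir, vo, host, host_dn, ca_dn):
    """Create <vomsdir>/<vo>/<host>.lsc with the host DN on
    line 1 and the issuing CA DN on line 2."""
    d = os.path.join(vomsdir, vo)
    os.makedirs(d, exist_ok=True)
    path = os.path.join(d, host + ".lsc")
    with open(path, "w") as f:
        f.write(host_dn + "\n" + ca_dn + "\n")
    return path

# Example using the virgo record above, written under a temp dir
# rather than the real /etc/grid-security/vomsdir.
tmp = tempfile.mkdtemp()
p = write_lsc(tmp, "virgo", "voms.cnaf.infn.it",
              "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it",
              "/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4")
print(open(p).read())
```

The ordering matters: VOMS clients check the first line against the server certificate subject and the second against its issuer, so swapping the two lines breaks attribute-certificate validation.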
  
  
{{BOX VO|VO.COMPLEX-SYSTEMS.EU|<!-- VOMS RECORDS for VO.COMPLEX-SYSTEMS.EU -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.complex-systems.eu/voms2.hellasgrid.gr.lsc
<pre><nowiki>
/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.complex-systems.eu-voms2.hellasgrid.gr
<pre><nowiki>
"vo.complex-systems.eu" "voms2.hellasgrid.gr" "15160" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "vo.complex-systems.eu"
</nowiki></pre>

Notes:

n/a

}}
  
  
{{BOX VO|VO.CTA.IN2P3.FR|<!-- VOMS RECORDS for VO.CTA.IN2P3.FR -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.cta.in2p3.fr/cclcgvomsli01.in2p3.fr.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.cta.in2p3.fr-cclcgvomsli01.in2p3.fr
<pre><nowiki>
"vo.cta.in2p3.fr" "cclcgvomsli01.in2p3.fr" "15008" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "vo.cta.in2p3.fr"
</nowiki></pre>

Notes:

n/a

}}
  
  
{{BOX VO|VO.LANDSLIDES.MOSSAIC.ORG|<!-- VOMS RECORDS for VO.LANDSLIDES.MOSSAIC.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk
<pre><nowiki>
"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"
</nowiki></pre>

Notes:

n/a

}}
  
{{BOX VO|VO.MAGRID.MA|<!-- VOMS RECORDS for VO.MAGRID.MA -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.magrid.ma/voms.magrid.ma.lsc
<pre><nowiki>
/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma
/C=MA/O=MaGrid/CN=MaGrid CA
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.magrid.ma-voms.magrid.ma
<pre><nowiki>
"vo.magrid.ma" "voms.magrid.ma" "15001" "/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma" "vo.magrid.ma"
</nowiki></pre>

Notes:

n/a

}}
  
  
{{BOX VO|VO.MOEDAL.ORG|<!-- VOMS RECORDS for VO.MOEDAL.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch
<pre><nowiki>
"vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.moedal.org-voms2.cern.ch
<pre><nowiki>
"vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"
</nowiki></pre>

Notes:

n/a

}}
  
  
{{BOX VO|VO.LANDSLIDES.MOSSAIC.ORG |<!-- VOMS RECORDS for VO.LANDSLIDES.MOSSAIC.ORG -->
 
  
 +
{{BOX VO|VO.NORTHGRID.AC.UK|<!-- VOMS RECORDS for VO.NORTHGRID.AC.UK -->
 +
''' Filename: ''' /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc
 +
<pre><nowiki>
 +
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
 +
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
 +
</nowiki></pre>
  
''' site-info.def version (sid) '''
+
''' Filename: ''' /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc
 
<pre><nowiki>
 
<pre><nowiki>
# VO_VO_LANDSLIDES_MOSSAIC_ORG_VOMS_SERVERS="'vomss://voms.gridpp.ac.uk:8443/voms/vo.landslides.mossaic.org?/vo.landslides.mossaic.org' 'vomss://voms02.gridpp.ac.uk:8443/voms/vo.landslides.mossaic.org?/vo.landslides.mossaic.org' 'vomss://voms03.gridpp.ac.uk:8443/voms/vo.landslides.mossaic.org?/vo.landslides.mossaic.org' "
+
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"
</nowiki></pre>

Notes:
n/a
 
}}
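Each vomses record above is a single line of five quoted fields: VO alias, VOMS server host, vomses port, server certificate DN, and VO name. As a quick sanity check, the fields can be split with plain POSIX shell; this is a minimal sketch (not part of any site tooling), using the vo.northgrid.ac.uk record shown above:

```shell
#!/bin/sh
# Split a vomses record (five double-quoted fields) into variables.
# The record text is copied from the vo.northgrid.ac.uk entry above.
record='"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"'

# Let the shell honour the double quotes when splitting the record.
eval "set -- $record"
vo_alias=$1 host=$2 port=$3 server_dn=$4 vo_name=$5

echo "alias=$vo_alias host=$host port=$port"
echo "server DN=$server_dn"
echo "VO=$vo_name"
```

The same split works for any of the records on this page, since the five-field layout is common to all of them.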
 
  
  
{{BOX VO|VO.SCOTGRID.AC.UK|<!-- VOMS RECORDS for VO.SCOTGRID.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

Notes:
n/a
 
}}
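The server DN in field 4 of a vomses record must agree with the first line of the matching .lsc file, otherwise proxy verification fails. A hedged sketch of that cross-check (the two files are recreated in a temporary directory here with the vo.scotgrid.ac.uk content from above; on a real node they live under /etc/vomses and /etc/grid-security/vomsdir):

```shell
#!/bin/sh
# Cross-check a vomses record against its .lsc file.
# File contents copied from the vo.scotgrid.ac.uk records above;
# the temporary paths are illustrative only.
tmp=$(mktemp -d)

cat > "$tmp/vo.scotgrid.ac.uk-voms.gridpp.ac.uk" <<'EOF'
"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"
EOF

cat > "$tmp/voms.gridpp.ac.uk.lsc" <<'EOF'
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
EOF

# Field 4 of the vomses record is the server certificate DN.
vomses_dn=$(sed 's/^"[^"]*" "[^"]*" "[^"]*" "\([^"]*\)".*/\1/' \
    "$tmp/vo.scotgrid.ac.uk-voms.gridpp.ac.uk")
# The first line of an .lsc file is the same host DN.
lsc_dn=$(head -n 1 "$tmp/voms.gridpp.ac.uk.lsc")

if [ "$vomses_dn" = "$lsc_dn" ]; then
  result=match
else
  result=mismatch
fi
echo "$result"
rm -rf "$tmp"
```

Running the same comparison over every vomses/.lsc pair a site deploys is a cheap way to catch copy-and-paste errors in these records.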
 
  
  
{{BOX VO|VO.SOUTHGRID.AC.UK|<!-- VOMS RECORDS for VO.SOUTHGRID.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

Notes:
n/a
 
}}
 
  
  
{{BOX VO|ZEUS|<!-- VOMS RECORDS for ZEUS -->
''' Filename: ''' /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/zeus-grid-voms.desy.de
<pre><nowiki>
"zeus" "grid-voms.desy.de" "15112" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "zeus"
</nowiki></pre>

Notes:
n/a
 
}}
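An .lsc file is exactly two lines: the VOMS server's host certificate DN followed by the DN of the CA that issued it. A minimal sketch of generating the zeus file from the DNs shown above (the output path is written under a temporary directory here; on a real node it would be /etc/grid-security/vomsdir/zeus):

```shell
#!/bin/sh
# Write the two-line .lsc file for the zeus VOMS server.
# DNs copied from the record above; the temp path is illustrative.
vomsdir=$(mktemp -d)/vomsdir/zeus
mkdir -p "$vomsdir"

host_dn='/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de'
ca_dn='/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4'

# Line 1: host certificate DN; line 2: issuing CA DN.
printf '%s\n%s\n' "$host_dn" "$ca_dn" > "$vomsdir/grid-voms.desy.de.lsc"

lines=$(wc -l < "$vomsdir/grid-voms.desy.de.lsc")
echo "wrote $vomsdir/grid-voms.desy.de.lsc"
```

The filename must match the VOMS server's hostname with an .lsc suffix, as in the records on this page.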
 
<!-- END OF SIDSECTION -->

== Not Listed in the EGI Operations Portal ==

{{BOX VO|PLANCK|<!-- VOMS RECORDS for PLANCK -->
''' Filename: ''' /etc/grid-security/vomsdir/planck/voms.cnaf.infn.it.lsc
<pre><nowiki>
/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/planck-voms.cnaf.infn.it
<pre><nowiki>
"planck" "voms.cnaf.infn.it" "15002" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it" "planck"
</nowiki></pre>

Notes:
n/a
}}

== VO Resource Requirements ==

<div style="margin:auto; border:2px solid black;background-color:#EEEEEE;width:600px; max-width:97%">
<div style="font-size:1.2em; font-weight:bold; padding-left:4px;background-color:#7C8AAF;color:#fff;">Please Note</div>
<div style="padding:3px 6px">
Please do not change the table below as it is automatically updated from the EGI Operations Portal.
Any changes you make will be lost.
</div>
</div>

<!-- START OF RESOURCES -->{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%"
<!-- |+VO Resource Requirements -->
 
|-style="background:#7C8AAF;color:white"
!VO
!RAM (MB)
!Max CPU time (min)
!Max wall clock time (min)
!Scratch
!Other
 
 
|-
|alice
|10000
|

|-
|atlas
|2048
|5760
|5760
|20000
|Additional runtime requirements:
* at least 4GB of VM for each job slot

Software installation common items:
* the full compiler suite (c/c++ and fortran) should be installed in the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages in SL4-like nodes;
* the recommended version of the compilers is 3.4.6;
* the f2c and libgfortran libraries (in both i386 and x86_64 versions, in case of x86_64 systems) are also required to run the software;
* other libraries required are:
:libpopt.so.0
:libblas.so
* other applications required are: uuencode, uudecode, bc, curl;
* high priority in the batch system for the atlassgm user;
* for nodes running at 64 bits, a copy of python compiled at 32 bits is also needed to use the 32 bits python bindings in the middleware. See https://twiki.cern.ch/twiki/bin/view/Atlas/RPMcompatSLC4 for more details;
* for SL5 nodes please refer to https://twiki.cern.ch/twiki/bin/view/Atlas/RPMCompatSLC5 and https://twiki.cern.ch/twiki/bin/view/Atlas/SL5Migration ;
* for SL6 nodes please refer to https://twiki.cern.ch/twiki/bin/view/AtlasComputing/RPMCompatSLC6 and https://twiki.cern.ch/twiki/bin/view/LCG/SL6Migration

Software installation setup (cvmfs sites):
* https://twiki.cern.ch/twiki/bin/view/Atlas/CernVMFS

Software installation requirements (non-cvmfs sites):
* an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
 
|-
|biomed
|100
|For sites providing an SE, minimal required storage space is 1TB.

|-
|calice
|CVMFS is used for the software distribution via:
:/cvmfs/calice.desy.de

For setup instructions refer to:
: http://grid.desy.de/cvmfs
 

|-
|camont
|1000
|600
|2880
|1
|

|-
|cdf
|0
|4320
|4320
|10
|yaim variables:

VO_CDF_VOMS_SERVERS="'vomss://voms.cnaf.infn.it:8443/voms/cdf?/cdf' 'vomss://voms-01.pd.infn.it:8443/voms/cdf?/cdf'"
VO_CDF_VOMSES="'cdf voms.cnaf.infn.it 15001 /C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it cdf' 'cdf voms-01.pd.infn.it 15001 /C=IT/O=INFN/OU=Host/L=Padova/CN=voms-01.pd.infn.it cdf'"
 
|-
|cernatschool.org
|0
|

|-
|cms
|4320
|20000
|Note: CMS usually sends 8-core pilots, values for 'Multi Core' refer to that. Single-core pilots are discouraged.

Jobs require an address space larger than the memory size specified
above. Sites should allow processes to use at least 6GB
of virtual address space more per core than memory to accommodate
the large amount of shared libraries used by jobs.
(For a typical 8-core pilot that would translate into a VZSIZE limit of at least 64GB.)

Cloud resources should provision 8-core VMs to match standard 8-core pilots.

National VOMS groups:
In CMS national VOMS groups, e.g. /cms/becms or /cms/dcms, are used. Those proxies must be "supported" at all sites in the following way:
* should be treated like /cms (base group), in case no special treatment is wanted by the site
* proxies with such national groups must be able to write to /store/user/temp (the PFN associated to this LFN)

|-
|comet.j-parc.jp
|2048
|1440
|2880
|40960
|
|-
|dteam
|0
|

|-
|dune
|0
|2880
|2880
|10000
|
 
+
 
+
 
|-
 
|enmr.eu
|8000
|2880
|4320
|1000
|1) For COVID-19 related jobs, slots with 8 GB/Core are required

# WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.

# The line:
"/enmr.eu/*"::::
has to be added to group.conf file before configuring via yaim the grid services.
of the file /etc/grid-security/groupmapfile.
It is required to enable whatever VO group added for implementing per-application accounting.
 
 
|-
|epic.vo.gridpp.ac.uk
|0
|
 
 
 
|-
|esr
No permanent storage needed but transient and durable.
Low-latency scheduling for short jobs needed.
 
 
 
|-
|fermilab
|0
|
 
 
 
 
 
 
|-
|geant4
CernVM-FS needs to be accessed on WN. CernVM-FS Cache area needed is about
5GB.
 
 
 
|-
|gridpp
|0
|
 
 
 
|-
|hyperk.org
|0
|1440
|1440
|10000
|
 
 
 
|-
|icecube
/cvmfs/icecube.opensciencegrid.org
 
 
 
|-
|ilc
|CVMFS is used for the software distribution via:
:/cvmfs/ilc.desy.de

For setup instructions refer to:
: http://grid.desy.de/cvmfs
 
+
 
+
 
|-
 
|-
 
|ipv6.hepix.org
 
|ipv6.hepix.org
Line 1,683: Line 2,062:
 
|0
 
|0
 
|
 
|
 
 
 
|-
 
|-
|vo.landslides.mossaic.org
+
|lhcb
 
|0
 
|0
 
|0
 
|0
 
|0
 
|0
|0
 
|
 
 
 
|-
 
|lhcb
 
|4000
 
|6000
 
|7200
 
 
|20000
 
|20000
 
|Further recommendations from LHCb for sites:
 
|Further recommendations from LHCb for sites:
Line 1,704: Line 2,072:
 
The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of a LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB which needs to be added as a requirement to the field  "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.
 
The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of a LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB which needs to be added as a requirement to the field  "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.
  
The amount of space in field "Max size of scratch space used by jobs", shall be interpreted as, 5 GB needed for local software installation, the remaining amount is needed 50 % each for downloaded input files and produced output files. T2 sites only providing Monte Carlo simulation will only need to provide the scratch space of local software installation.
+
The amount of space in field "Max size of scratch space used by jobs", shall be interpreted as 50 % each for downloaded input files and produced output files.  
  
The CPU limits are understood to be expressed in kSI2k.minutes
+
Sites should have the Centos7 or "Cern Centos7" operating system, or later versions, installed on their worker nodes. CPUs should support the x86_64_v2 instruction set (or later). Sites are requested to provide support for singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.  
  
The shared software area shall be provided via CVMFS. LHCb uses the mount point /cvmfs/lhcb.cern.ch on the worker nodes.  
+
The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.
 +
 
 +
The shared software area shall be provided via CVMFS. LHCb uses the mount points
 +
:      "/cvmfs/lhcb.cern.ch/",
 +
:      "/cvmfs/lhcb-condb.cern.ch/",
 +
:      "/cvmfs/lhcbdev.cern.ch/",
 +
:      "/cvmfs/unpacked.cern.ch/",
 +
:      "/cvmfs/cernvm-prod.cern.ch/",
 +
on the worker nodes.  
  
 
Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.
 
Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.
  
Advertisement of OS and machine capabilities in the BDII as described in https://wiki.egi.eu/wiki/HOWTO05 and https://wiki.egi.eu/wiki/HOWTO06 .
+
Non T1 sites providing CVMFS, direct HTCondorCE, ARC, or CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaign).
 
+
Separation of clusters running different OSes via different CEs.
+
 
+
Non T1 sites providing CVMFS, direct CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaign).
+
 
+
 
+
 
+
 
+
|-
+
|vo.londongrid.ac.uk
+
|2048
+
|1440
+
|1440
+
|2048
+
|The VO uses cvmfs to distribute its software:  /cvmfs/londongrid.gridpp.ac.uk
+
  
 +
Sites with disk storage must provide: 
 +
:- an xroot endpoint (single DNS entry), at least for reading
 +
:- an HTTPS endpoint (single DNS entry), both read and write, supporting Third Party Copy
 +
:- a way to do the accounting (preferably following the WLCG TF standard: https://twiki.cern.ch/twiki/bin/view/LCG/StorageSpaceAccounting)
  
 +
Sites with tape storage should be accessible from the other Tier1 and Tier2 sites. They should provide one of the supported WLCG tape systems (dCache or CTA). Tape classes to optimize data distribution is to be discusses on a per-site basis.
 
|-
 
|-
 
|lsst
 
|lsst
Line 1,738: Line 2,104:
 
|VO name must be "lsst" as it is an existing VO in OSG!
 
|VO name must be "lsst" as it is an existing VO in OSG!
 
cf VOMS URL
 
cf VOMS URL
 
 
 
|-
 
|-
 
|lz
 
|lz
Line 1,747: Line 2,111:
 
|0
 
|0
 
|
 
|
 
 
 
|-
 
|-
 
|magic
 
|magic
Line 1,756: Line 2,118:
 
|0
 
|0
 
|Fortran77 and other compilers. See details in annex of MoU (documentation section).
 
|Fortran77 and other compilers. See details in annex of MoU (documentation section).
 
 
 
|-
 
|-
 
|mice
 
|mice
Line 1,765: Line 2,125:
 
|0
 
|0
 
|
 
|
 
 
|-
 
|vo.moedal.org
 
|0
 
|0
 
|0
 
|0
 
|
 
 
 
 
|-
 
|-
 
|na62.vo.gridpp.ac.uk
 
|na62.vo.gridpp.ac.uk
 
|2048
 
|2048
|300
+
|500
|600
+
|720
 
|2048
 
|2048
 
|VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch
 
|VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch
  
 
Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch
 
Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch
 
 
 
|-
|ops
|0
|0
|0
|
 
 
 
|-
|pheno
|0
|0
|0
|
 
 
 
|-
|skatelescope.eu
|0
|0
|0
|
           
 
 
 
 
 
 
 
|-
|snoplus.snolab.ca
SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.
SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.
 
 
 
|-
 
|-
|vo.southgrid.ac.uk
+
|solidexperiment.org
 
|0
 
|0
 
|0
 
|0
 
|0
 
|0
 
|0
 
|0
|
+
|will need to set up CVMFS.
 
+
 
+
|-
+
|supernemo.vo.eu-egee.org
+
|0
+
|0
+
|5760
+
|5
+
|
+
           
+
 
+
 
+
 
|-
 
|-
 
|t2k.org
 
|t2k.org
Line 1,880: Line 2,182:
 
|1000
 
|1000
 
|t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
 
|t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
 
+
|-
 
+
|virgo
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.complex-systems.eu
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.cta.in2p3.fr
 +
|0
 +
|0
 +
|2000
 +
|0
 +
|
 +
|-
 +
|vo.landslides.mossaic.org
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.magrid.ma
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.moedal.org
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.northgrid.ac.uk
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.scotgrid.ac.uk
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.southgrid.ac.uk
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 
|-
 
|-
 
|zeus
 
|zeus
Line 1,890: Line 2,253:
 
|CVMFS is used for the software distribution via:
 
|CVMFS is used for the software distribution via:
  
  /cvmfs/zeus.desy.de  
+
:/cvmfs/zeus.desy.de  
  
 
For setup instructions refer to:
 
For setup instructions refer to:
  
  http://grid.desy.de/cvmfs
|}
<!-- END OF RESOURCES -->
 
  
|}
+
== VO Activity ==
==VO enablement  ==
+
  
The VOs that are enabled at each site are listed in a [http://pprc.qmul.ac.uk/~lloyd/gridpp/votable.html VO table].  
+
The VOs that are enabled at each site are listed in [https://pprc.qmul.ac.uk/~lloyd/ukmetrics/ukmetrics.php?page=vos this VO table].
                                                       
+
<!-- [https://gfe03.hep.ph.ic.ac.uk:4175/vosupport.html VO Supported in London (Auto Generated)]
+
  
[[Image:20070227-enabled-vos.PNG]] -->
 
  
  
Line 1,918: Line 2,272:
 
[[Category:VOMS]]
 
[[Category:VOMS]]
  
{{KeyDocs|responsible=Steve Jones|reviewdate=2016-10-20|accuratedate=2016-10-20|percentage=100}}
+
<!-- START UPDATE DATE -->{{KeyDocs|responsible=Gerard Hand|reviewdate=2024/04/23, 14:40:12|accuratedate=2024/04/23|percentage=100}}<!-- END UPDATE DATE -->

Latest revision as of 13:44, 23 April 2024


Please Note

Please do not change the vomsdir/ or vomses/ entries or the VO Resource Requirements section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!


Approved VOs

Name Area Contact
alice The ALICE Collaboration is operating a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Our aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected. Latchezar.Betev@cern.ch
Maarten.Litmaath@cern.ch
costin.grigoras@cern.ch
bes Beijing Spectrometer (BES) is a general-purpose detector located in the interaction region of the BEPC storage ring, where the electron and positron beams collide. The BES Collaboration consists of approximately 200 physicists and engineers from 27 institutions in 4 countries.
biomed This VO covers the areas related to health and life sciences. Currently, it is divided into 3 sectors: medical imaging, bioinformatics and drug discovery. The VO is openly accessible to academics, and to private company for non-commercial purposes. glatard@creatis.insa-lyon.fr
jerome.pansanel@iphc.cnrs.fr
sorina.pop@creatis.insa-lyon.fr
glatard@creatis.insa-lyon.fr
calice CAlorimeter for the LInear Collider Experiment

A high granularity calorimeter optimised for the Particle Flow measurement of multi-jets final state at the International Linear Collider running at a center-of-mass between 90 GeV and 1 TeV.

thomas.hartmann@desy.de
andreas.gellrich@desy.de
cepc The Circular Electron Positron Collider (CEPC) is a large international scientific facility proposed by the Chinese particle physics community in 2012
cernatschool.org The CERN@school VO represents the CERN@school project on the Grid. CERN@school aims to bring CERN technology into the classroom to aid with the teaching of physics and to inspire the next generation of scientists and engineers. The CERN@school VO will allow students and teachers involved with the project to harness GridPP to store and analyse data from the CERN@school detectors, the LUCID experiment and the associated GEANT4 simulations.
clas12
vo.complex-systems.eu The goal of the vo.complex-systems.eu is to promote the study of

complex systems and complex networks on the Grid infrastructure. The vo.complex-systems.eu Virtual Organization will also serve as the building layer of collaboration among international scientists focusing on the research area of Complexity Science.

romain.reuillon@iscpif.fr
comet.j-parc.jp Muon-to-electron conversion experiment at J-PARC, which will be used by international COMET collaborators for design studies and data analysis. COMET will test Beyond-the-Standard-Model physics in a way that is complementary to the experiments at the LHC. daniela.bauer@imperial.ac.uk
Yoshi.Uchida@imperial.ac.uk
simon.fayer05@imperial.ac.uk
dteam The goal of the VO is to facilitate the deployment of a stable production Grid infrastructure. To this end, members of this VO (who have to be associated with a registered site and be involved in its operation) are allowed to run tests to validate the correct configuration of their site. Site performance evaluation and/or monitoring programs may also be run under the DTEAM VO with the approval of the Site Manager, subject to the agreement of the affected sites' management. kkoum@admin.grnet.gr
alessandro.paolini@egi.eu
matthew.viljoen@egi.eu
kyrginis@admin.grnet.gr
enmr.eu Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with both user-friendly tools, platforms for data analysis and exchange, and an underlying e-infrastructure. WeNMR groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organization, which is currently the second largest VO in the area of life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology towards EGI for the users of existing infrastructures. It involves researchers from around the world and will build bridges to other areas of structural biology. Integration with SAXS, a rapidly growing and highly complementary method, is directly included in WeNMR, but links will also be established to related initiatives. WeNMR will serve all relevant INSTRUCT communities in line with the ESFRI roadmap. Marco.Verlato@pd.infn.it
a.m.j.j.bonvin@uu.nl
rosato@cerm.unifi.it
giachetti@cerm.unifi.it
verlato@infn.it
epic.vo.gridpp.ac.uk EPIC replaces an earlier EPIC project that was focused on Veterinary Surveillance (Phase 1). This new consortium EPIC project aims to become a world leader in policy-linked research and includes some of Scotland’s leading veterinary epidemiologists and scientists.

The overarching purpose for the Centre is to provide access to high quality advice and analyses on the epidemiology and control of animal diseases that are important to Scotland, and to best prepare Scotland for the next major disease incursion. Ultimately, this strategic advice to the Scottish Government will help ensure that the interests of the various stakeholders involved in disease emergency planning and response are met as effectively as possible. This all must be achieved within the context of our rapidly changing environment. For example, issues such as climate change are now influencing the livestock disease risks that Scotland faces.

thomas.doherty@glasgow.ac.uk
esr Earth Science Research (ESR) covers research in the fields of the Solid Earth, Ocean, Atmosphere and their interfaces. A large variety of communities correspond to each domain, some of them covering several domains.


In the ESR Virtual Organization (ESR-VO) four domains are represented:

  1. Earth Observation
  2. Climate
  3. Hydrology
  4. Solid Earth Physics
andre.gemuend@scai.fraunhofer.de
weissenb@ccr.jussieu.fr
weissenb@ccr.jussieu.fr
weissenb@ccr.jussieu.fr
fermilab Fermilab Virtual Organization (VO) - The Fermilab VO is an "umbrella" VO that includes the Fermilab Campus Grid (FermiGrid) and Fermilab Grid Testing (ITB) infrastructures, and all Fermilab computing activities that are not big enough to have their own Virtual Organization. Broadly these include the intensity frontier program, theoretical simulations, fixed target analysis, and accelerator and beamline design as well as activities performed by the Fermilab Campus Grid administrators. garzoglio@fnal.gov
boyd@fnal.gov
geant4 Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278. Andrea.Sciaba@cern.ch
Andrea.Sciaba@cern.ch
Andrea.Dotti@cern.ch
gridpp GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN. They are building a distributed computing Grid across the UK for particle physicists. At the moment there is a working particle physics Grid across 17 UK institutions. m.doidge@lancaster.ac.uk
hyperk.org We propose the Hyper-Kamiokande (Hyper-K) detector as a next-generation underground water Cherenkov detector. It will serve as the far detector of a long-baseline neutrino oscillation experiment envisioned for the upgraded J-PARC, and as a detector capable of observing -- far beyond the sensitivity of the Super-Kamiokande (Super-K) detector -- proton decays, atmospheric neutrinos, and neutrinos of astronomical origin. The baseline design of Hyper-K is based on the highly successful Super-K, taking full advantage of a well-proven technology. C.J.Walker@qmul.ac.uk
francesca.di_lodovico@kcl.ac.uk
icecube The goal of the VO is to enable the usage of Grid resources for IceCube collaboration members, mainly for simulation and reconstruction. thomas.hartmann@desy.de
andreas.gellrich@desy.de
andreas.haupt@desy.de
ilc VO for the International Linear Collider Community. thomas.hartmann@desy.de
andreas.gellrich@desy.de
Christoph.Wissing@desy.de
ipv6.hepix.org The goal of the VO is to carry out testing of IPv6 readiness, functionality and performance of the middleware, applications and tools required by the stakeholder communities, especially HEP. Other authorised activities include use of the testbed by related IPv6 activities inside EGI, the related middleware technology providers and other Infrastructures used by WLCG/HEP. david.kelsey@stfc.ac.uk
lz This VO supports the LUX-ZEPLIN (LZ) experiment, designed to search for dark matter. E.Korolkova@sheffield.ac.uk
j.dobson@ucl.ac.uk
magic MAGIC is a system of two imaging atmospheric Cherenkov telescopes (or IACTs). MAGIC-I started routine operation after commissioning in 2004. Construction of MAGIC-II was completed in early 2009, and the two telescopes have been in operation ever since, with a break in 2012 for an upgrade that achieved full homogeneity. The project is funded primarily by the funding agencies BMBF (Germany), MPG (Germany), INFN (Italy), MICINN (Spain), and the ETH Zurich (Switzerland). neissner@pic.es
contrera@gae.ucm.es
rfirpo@pic.es
vo.magrid.ma VO vo.magrid.ma is a multidisciplinary VO providing general grid services and support to the Moroccan scientific community. rahim@cnrst.ma
mice A VO to support the activities of the Muon Ionisation Cooling Experiment (MICE). Specifically, it enables the movement of MICE data around the Grid, followed by the submission of analysis jobs against these data. This is expected to be a small VO. d.colling@imperial.ac.uk
p.hodgson@sheffield.ac.uk
daniela.bauer@imperial.ac.uk
janusz.martyniak@imperial.ac.uk
uboone MicroBooNE is a large 170-ton liquid-argon time projection chamber (LArTPC) neutrino experiment located on the Booster neutrino beamline at Fermilab.
mu3e The Mu3e experiment is a new search for the lepton-flavour violating decay of a positive muon into two positrons and one electron.
na62.vo.gridpp.ac.uk The NA62 VO (na62.vo.gridpp.ac.uk) is meant to provide grid computing and data storage resources to the NA62 collaboration. The NA62 VO is supported by University of Cambridge, University of Glasgow, Imperial College London, University of Birmingham, University of Lancaster, University of Liverpool, University of Manchester, Oxford University and RAL (from the UK), CERN, CNAF (Italy) and UCL (Belgium). More info about the NA62 experiment can be found on http://na62.web.cern.ch/na62/. The production portal is located at http://na62.gla.ac.uk/ Dan.Protopopescu@glasgow.ac.uk
David.Britton@glasgow.ac.uk
ops The goal of the VO is to facilitate the operations of the LCG/EGI infrastructure, which includes running official monitoring, re-certification and performance evaluation tools. Additionally the VO will be used for interoperations with other grid infrastructures. eimamagi@srce.hr
alessandro.paolini@egi.eu
pheno Phenogrid is the VO for UK theorists who do not fit within one of the LHC experiments (e.g. developers of Monte Carlo event generators). jeppe.andersen@durham.ac.uk
adam.j.boutcher@durham.ac.uk
paul.clark@durham.ac.uk
snoplus.snolab.ca VO for the snoplus experiment, a multi-purpose liquid scintillator neutrino experiment based in Sudbury, Canada. Members of the snoplus virtual organisation will contribute to the European computing effort to accurately simulate the SNOplus detector response. Jeanne.wilson@kcl.ac.uk
C.J.Walker@qmul.ac.uk
m.mottram@qmul.ac.uk
solidexperiment.org Supports grid users of the SoLid experiment. daniela.bauer@imperial.ac.uk
antonin.vacheret@imperial.ac.uk
t2k.org T2K is a neutrino experiment designed to investigate how neutrinos change from one flavour to another as they travel (neutrino oscillations). An intense beam of muon neutrinos is generated at the J-PARC nuclear physics site on the East coast of Japan and directed across the country to the Super-Kamiokande neutrino detector in the mountains of western Japan. The beam is measured once before it leaves the J-PARC site, using the near detector ND280, and again at Super-K: the change in the measured intensity and composition of the beam is used to provide information on the properties of neutrinos. sophie.king@kcl.ac.uk
tomislav.vladisavljevic@stfc.ac.uk
virgo Scientific target: detection of gravitational waves. Gravitational waves are predicted by the General Theory of Relativity but still not directly detected due to their extremely weak interaction with matter. Large interferometric detectors, like Virgo, are operating with the aim of directly detecting gravitational signals from various astrophysical sources. Signals are expected to be deeply buried in detector noise, and suitable data analysis algorithms are developed in order to allow detection and signal parameter estimation. For many kinds of searches large computing resources are needed, and in some important cases we are computationally bound: the larger the available computing power, the wider the portion of source parameter space that can be explored.

VO target: to allow data management and computationally intensive data analysis

cristiano.palomba@roma1.infn.it
alberto.colla@roma1.infn.it
vo.landslides.mossaic.org A virtual organisation for landslide modellers associated with the Management of Slope Stability in Communities (MoSSaiC) project. The VO is used for running landslide modelling software such as CHASM and QUESTA. l.kreczko@bristol.ac.uk
vo.moedal.org The MoEDAL VO allows members of the MoEDAL Collaboration to perform all of the computing activities relevant for the MoEDAL experiment, making use of available resources according to the policy defined by the Collaboration. t.whyntie@qmul.ac.uk
daniel.felea@cern.ch
vo.northgrid.ac.uk Regional Virtual Organisation created to give other local disciplines access to HEP resources at the NorthGrid sites: Manchester, Lancaster, Liverpool, Sheffield. Users from these universities can apply. alessandra.forti@cern.ch
robert.frank@manchester.ac.uk
robert.frank@manchester.ac.uk
vo.scotgrid.ac.uk The VO is for academic and other users in Scotland to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access. It is also designed as a test VO to allow maintenance and operational testing of site services. garth.roy@glasgow.ac.uk
vo.southgrid.ac.uk The VO is for academic and other users in the SouthGrid (UKI-SOUTHGRID-BHAM-HEP, UKI-SOUTHGRID-BRIS-HEP, UKI-SOUTHGRID-CAM-HEP, UKI-SOUTHGRID-OX-HEP, UKI-SOUTHGRID-RALPP, UKI-SOUTHGRID-SUSX) region to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access.

pete.gronbech@physics.ox.ac.uk
zeus ZEUS is a collaboration of about 450 physicists who are running a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. The ZEUS detector is a sophisticated tool for studying the particle reactions provided by the high-energetic beams of the HERA accelerator. Thus the participating scientists are pushing forward our knowledge of the fundamental particles and forces of nature, gaining unsurpassed insight into the exciting laws of the microcosm. thomas.hartmann@desy.de
andreas.gellrich@desy.de

== IRIS Partners ==

Name Area Contact
atlas The ATLAS VO allows members of the ATLAS collaboration to perform all the computing activities relevant to the ATLAS experiment, making use of the available resources following the policy defined by the Collaboration. Alessandro.DeSalvo@roma1.infn.it
Elisabetta.Vilucchi@lnf.infn.it
jd@bnl.gov
james.william.walder@cern.ch
cms The Compact Muon Solenoid (CMS) experiment is a large general-purpose particle physics detector built at the proton-proton Large Hadron Collider (LHC) at CERN in Switzerland. Andreas.Pfeiffer@cern.ch
stefano.belforte@cern.ch
stefano.belforte@ts.infn.it
Daniele.Bonacorsi@bo.infn.it
Christoph.Wissing@desy.de
sexton@gmail.com
lammel@fnal.gov
jose.hernandez@ciemat.es
Daniele.Bonacorsi@bo.infn.it
gutsche@fnal.gov
Andrea.Sciaba@cern.ch
vo.cta.in2p3.fr Monte Carlo simulation production and analysis for the CTA (Cherenkov Telescope Array) international consortium.

cecile.barbier@lapp.in2p3.fr
arrabito@in2p3.fr
dune DUNE is the Deep Underground Neutrino Experiment managed by the global DUNE collaboration and hosted at Fermilab. We are building a deep-underground Liquid-Argon based neutrino detector to study accelerator-based neutrino oscillations, supernova neutrinos, and nucleon decay. andrew.mcnab@cern.ch
timm@fnal.gov
lhcb The LHCb (Large Hadron Collider Beauty) experiment mainly aims to solve the mystery of the matter-antimatter imbalance in the Universe. andrew.mcnab@cern.ch
concezio.bozzi@cern.ch
christophe.denis.haen@cern.ch
jan.van.eldik@cern.ch
joel.closier@cern.ch
ben.couturier@cern.ch
joel.closier@cern.ch
lsst The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey telescope with a 3200-megapixel camera, designed to image faint astronomical objects, rapidly scan the sky and observe probes of dark matter and dark energy. LSST Data Management and Simulation jobs will run on OSG and EGI. boutigny@in2p3.fr
IGoodenow@lsst.org
fabio@in2p3.fr
yangw@SLAC.stanford.edu
kherner@fnal.gov
skatelescope.eu The Square Kilometre Array (SKA) project is an international effort to build the world’s largest radio telescope, with eventually over a square kilometre (one million square metres) of collecting area. The scale of the SKA represents a huge leap forward in both engineering and research & development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavours in history, the SKA will bring together a wealth of the world’s finest scientists, engineers and policy makers to bring the project to fruition.

The skatelescope.eu VO supports this project.

alessandra.forti@cern.ch
andrew.mcnab@cern.ch
rohini.joshi@manchester.ac.uk
eucliduk.net The Euclid mission aims to understand why the expansion of the Universe is accelerating and the nature of the source responsible for this acceleration, which physicists refer to as dark energy. msh@roe.ac.uk

== Approved VOs being established into GridPP infrastructure ==

As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs (your site cannot yet support these, but when the VO is set up and functioning we will let you know).

Name Area Contact

== VOs that have been removed from the approved list ==

The table below comprises a history of VOs that have been removed from the approved list for various reasons.

Name Date of removal Notes
babar 9 Oct 2013 none
camont 7th June 2017 none
camont.gridpp.ac.uk 9 Oct 2013 none
cdf 7th June 2017 none
cedar 9 Oct 2013 none
dzero 7th June 2017 none
fusion 30 Jan 2017 Discussion with Rubén Vallés Pérez. VO appears defunct.
hone 24 Nov 2015 Discussed at Ops Meeting. Defunct.
ltwo 9 Oct 2013 none
minos.vo.gridpp.ac.uk 9 Oct 2013 none
na48 9 Oct 2013 none
neiss 7th June 2017 none
ngs.ac.uk 9 Oct 2013 none
superbvo.org 19 Jan 2016 Discussed at Ops Meeting. Defunct.
supernemo.vo.eu-egee.org 24 Feb 2020 now called supernemo.org
totalep 9 Oct 2013 none
vo.londongrid.ac.uk in progress [GGUS] VO not used any more
vo.sixt.cern.ch 11 Nov 2015 No members, no voms servers, defunct

== Example site-info.def entries ==

The example site-info.def entries for YAIM have been moved to a separate page: Example site-info.def entries

== Please Note ==

Please do not change the vomsdir/ or vomses/ entries below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!
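For reference, each `.lsc` file below holds two lines: the subject DN of the VOMS server's host certificate, followed by the DN of the CA that issued it. Each `/etc/vomses` entry is a single line of five double-quoted fields: VO name, server host, port, server DN, and VO alias. A minimal sketch of how a site might cross-check the two sets of records, assuming the file layout listed below (the function names, and the optional path parameters added for testing, are illustrative, not part of any standard tool):

```python
import shlex
from pathlib import Path

def read_lsc(path):
    """Return (subject_dn, issuer_dn) from a two-line .lsc file."""
    lines = [l.strip() for l in Path(path).read_text().splitlines() if l.strip()]
    return lines[0], lines[1]

def read_vomses(path):
    """Return the five quoted fields of a vomses entry:
    (vo, host, port, server_dn, alias)."""
    # shlex.split handles the double-quoted fields, including DNs with spaces.
    fields = shlex.split(Path(path).read_text().strip())
    if len(fields) != 5:
        raise ValueError(f"malformed vomses entry in {path}")
    return tuple(fields)

def check_vo(vo, host, vomsdir="/etc/grid-security/vomsdir", vomses="/etc/vomses"):
    """Check that the vomses server DN matches the .lsc subject DN."""
    subject_dn, _issuer_dn = read_lsc(f"{vomsdir}/{vo}/{host}.lsc")
    _, _, _, server_dn, _ = read_vomses(f"{vomses}/{vo}-{host}")
    return subject_dn == server_dn
```

For example, with the ALICE records below in place, `check_vo("alice", "voms2.cern.ch")` should return True; a mismatch would indicate a stale or hand-edited record.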


Virtual Organisation: ALICE

Filename: /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms-alice-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/alice-lcg-voms2.cern.ch

"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms2.cern.ch

"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms-alice-auth.app.cern.ch

"alice" "voms-alice-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch" "alice"

Notes: n/a


Virtual Organisation: ATLAS

Filename: /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/atlas-lcg-voms2.cern.ch

"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms2.cern.ch

"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms-atlas-auth.app.cern.ch

"atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"

Notes: n/a


Virtual Organisation: BES

Filename: /etc/grid-security/vomsdir/bes/voms.ihep.ac.cn.lsc

/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority

Filename: /etc/vomses/bes-voms.ihep.ac.cn

"bes" "voms.ihep.ac.cn" "15001" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "bes"

Notes: n/a


Virtual Organisation: BIOMED

Filename: /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc

/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/biomed-cclcgvomsli01.in2p3.fr

"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "biomed"

Notes: n/a


Virtual Organisation: CALICE

Filename: /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/calice-grid-voms.desy.de

"calice" "grid-voms.desy.de" "15102" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "calice"

Notes: n/a


Virtual Organisation: CEPC

Filename: /etc/grid-security/vomsdir/cepc/voms.ihep.ac.cn.lsc

/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority

Filename: /etc/vomses/cepc-voms.ihep.ac.cn

"cepc" "voms.ihep.ac.cn" "15005" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "cepc"

Notes: n/a


Virtual Organisation: CERNATSCHOOL.ORG

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/cernatschool.org-voms.gridpp.ac.uk

"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk

"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk

"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"

Notes: n/a


Virtual Organisation: CLAS12

Notes: n/a


Virtual Organisation: CMS

Filename: /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/cms-lcg-voms2.cern.ch

"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms2.cern.ch

"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms-cms-auth.app.cern.ch

"cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"

Notes: n/a


Virtual Organisation: COMET.J-PARC.JP

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/comet.j-parc.jp-voms.gridpp.ac.uk

"comet.j-parc.jp" "voms.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "comet.j-parc.jp"

Filename: /etc/vomses/comet.j-parc.jp-voms02.gridpp.ac.uk

"comet.j-parc.jp" "voms02.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "comet.j-parc.jp"

Filename: /etc/vomses/comet.j-parc.jp-voms03.gridpp.ac.uk

"comet.j-parc.jp" "voms03.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "comet.j-parc.jp"

Notes: n/a


Virtual Organisation: DTEAM

Filename: /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/dteam-voms2.hellasgrid.gr

"dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"

Notes: n/a


Virtual Organisation: DUNE

Filename: /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/vomses/dune-voms1.fnal.gov

"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "dune"

Filename: /etc/vomses/dune-voms2.fnal.gov

"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "dune"

Notes: n/a


Virtual Organisation: ENMR.EU

Filename: /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/enmr.eu-voms2.cnaf.infn.it

"enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "enmr.eu"

Notes: n/a


Virtual Organisation: EPIC.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: ESR

Filename: /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc

/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl
/C=NL/O=NIKHEF/CN=NIKHEF medium-security certification auth

Filename: /etc/vomses/esr-voms.grid.sara.nl

"esr" "voms.grid.sara.nl" "30001" "/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl" "esr"

Notes: n/a


Virtual Organisation: EUCLIDUK.NET

Filename: /etc/grid-security/vomsdir/eucliduk.net/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/eucliduk.net-voms.gridpp.ac.uk

"eucliduk.net" "voms.gridpp.ac.uk" "15518" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "eucliduk.net"

Notes: n/a


Virtual Organisation: FERMILAB

Filename: /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/vomses/fermilab-voms1.fnal.gov

"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "fermilab"

Filename: /etc/vomses/fermilab-voms2.fnal.gov

"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "fermilab"

Notes: n/a


Virtual Organisation: GEANT4

Filename: /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/geant4-lcg-voms2.cern.ch

"geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"

Filename: /etc/vomses/geant4-voms2.cern.ch

"geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"

Notes: n/a


Virtual Organisation: GRIDPP

Filename: /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/gridpp-voms.gridpp.ac.uk

"gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms02.gridpp.ac.uk

"gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms03.gridpp.ac.uk

"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"

Notes: n/a


Virtual Organisation: HYPERK.ORG

Filename: /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/hyperk.org-voms.gridpp.ac.uk

"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms02.gridpp.ac.uk

"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms03.gridpp.ac.uk

"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"

Notes: n/a


Virtual Organisation: ICECUBE

Filename: /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/icecube-grid-voms.desy.de

"icecube" "grid-voms.desy.de" "15106" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "icecube"

Notes: n/a


Virtual Organisation: ILC

Filename: /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/ilc-grid-voms.desy.de

"ilc" "grid-voms.desy.de" "15110" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "ilc"

Notes: n/a


Virtual Organisation: IPV6.HEPIX.ORG

Filename: /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it

"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"

Notes: n/a


Virtual Organisation: LHCB

Filename: /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/lhcb-lcg-voms2.cern.ch

"lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"

Filename: /etc/vomses/lhcb-voms2.cern.ch

"lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"

Notes: n/a


Virtual Organisation: LSST

Filename: /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc

/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/vomses/lsst-voms.slac.stanford.edu

"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu" "lsst"

Notes: n/a


Virtual Organisation: LZ

Filename: /etc/grid-security/vomsdir/lz/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/lz/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/lz-voms.gridpp.ac.uk

"lz" "voms.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "lz"

Filename: /etc/vomses/lz-voms02.gridpp.ac.uk

"lz" "voms02.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "lz"

Notes: n/a


Virtual Organisation: MAGIC

Notes: n/a


Virtual Organisation: MICE

Filename: /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mice-voms.gridpp.ac.uk

"mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms02.gridpp.ac.uk

"mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms03.gridpp.ac.uk

"mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"

Notes: n/a


Virtual Organisation: MU3E

Filename: /etc/grid-security/vomsdir/mu3e/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mu3e-voms.gridpp.ac.uk

"mu3e" "voms.gridpp.ac.uk" "15516" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mu3e"

Notes: n/a


Virtual Organisation: NA62.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: OPS

Filename: /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/ops-lcg-voms2.cern.ch

"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"

Filename: /etc/vomses/ops-voms2.cern.ch

"ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"

Notes: n/a


Virtual Organisation: PHENO

Filename: /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/pheno-voms.gridpp.ac.uk

"pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms02.gridpp.ac.uk

"pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms03.gridpp.ac.uk

"pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"

Notes: n/a


Virtual Organisation: SKATELESCOPE.EU

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk

"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk

"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk

"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"

Notes: n/a


Virtual Organisation: SNOPLUS.SNOLAB.CA

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk

"snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk

"snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk

"snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"

Notes: n/a


Virtual Organisation: SOLIDEXPERIMENT.ORG

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk

"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk

"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk

"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"

Notes: n/a


Virtual Organisation: T2K.ORG

Filename: /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/t2k.org-voms.gridpp.ac.uk

"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms02.gridpp.ac.uk

"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms03.gridpp.ac.uk

"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"

Notes: n/a


Virtual Organisation: UBOONE

Notes: n/a


Virtual Organisation: VIRGO

Filename: /etc/grid-security/vomsdir/virgo/voms.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/virgo-voms.cnaf.infn.it

"virgo" "voms.cnaf.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it" "virgo"

Notes: n/a


Virtual Organisation: VO.COMPLEX-SYSTEMS.EU

Filename: /etc/grid-security/vomsdir/vo.complex-systems.eu/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/vo.complex-systems.eu-voms2.hellasgrid.gr

"vo.complex-systems.eu" "voms2.hellasgrid.gr" "15160" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "vo.complex-systems.eu"

Notes: n/a


Virtual Organisation: VO.CTA.IN2P3.FR

Filename: /etc/grid-security/vomsdir/vo.cta.in2p3.fr/cclcgvomsli01.in2p3.fr.lsc

/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/vo.cta.in2p3.fr-cclcgvomsli01.in2p3.fr

"vo.cta.in2p3.fr" "cclcgvomsli01.in2p3.fr" "15008" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "vo.cta.in2p3.fr"

Notes: n/a


Virtual Organisation: VO.LANDSLIDES.MOSSAIC.ORG

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"

Notes: n/a


Virtual Organisation: VO.MAGRID.MA

Filename: /etc/grid-security/vomsdir/vo.magrid.ma/voms.magrid.ma.lsc

/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma
/C=MA/O=MaGrid/CN=MaGrid CA

Filename: /etc/vomses/vo.magrid.ma-voms.magrid.ma

"vo.magrid.ma" "voms.magrid.ma" "15001" "/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma" "vo.magrid.ma"

Notes: n/a


Virtual Organisation: VO.MOEDAL.ORG

Filename: /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch

"vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"

Filename: /etc/vomses/vo.moedal.org-voms2.cern.ch

"vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"

Notes: n/a


Virtual Organisation: VO.NORTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SCOTGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SOUTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"

Notes: n/a


Virtual Organisation: ZEUS

Filename: /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/zeus-grid-voms.desy.de

"zeus" "grid-voms.desy.de" "15112" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "zeus"

Notes: n/a



== Not Listed in the EGI Operations Portal ==

Virtual Organisation: PLANCK

Filename: /etc/grid-security/vomsdir/planck/voms.cnaf.infn.it.lsc

/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority

Filename: /etc/vomses/planck-voms.cnaf.infn.it

"planck" "voms.cnaf.infn.it" "15002" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it" "planck"


Notes: n/a
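All of the vomses records on this page share the same five quoted fields, so a quick consistency check over /etc/vomses can be scripted. A sketch in Python (the parse_vomses_line helper is hypothetical, written here for illustration; it is not part of any grid middleware):

```python
import re

# A vomses record is five double-quoted fields:
# "alias" "host" "port" "server DN" "VO name"
VOMSES_RE = re.compile(
    r'^"([^"]*)"\s+"([^"]*)"\s+"(\d+)"\s+"([^"]*)"\s+"([^"]*)"\s*$'
)

def parse_vomses_line(line):
    """Return (alias, host, port, dn, vo) or raise ValueError on a bad record."""
    m = VOMSES_RE.match(line.strip())
    if not m:
        raise ValueError(f"malformed vomses record: {line!r}")
    alias, host, port, dn, vo = m.groups()
    if not 1 <= int(port) <= 65535:
        raise ValueError(f"port out of range: {port}")
    return alias, host, int(port), dn, vo

# Example: the LZ record from this page
rec = ('"lz" "voms.gridpp.ac.uk" "15517" '
       '"/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "lz"')
print(parse_vomses_line(rec))
```

Running a loop of this over every file in /etc/vomses is a cheap way to catch the quoting and field-count mistakes that otherwise surface only as proxy-initialisation failures.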

== VO Resource Requirements ==

'''Please note:''' do not change the table below, as it is automatically updated from the EGI Operations Portal; any changes you make will be lost.


VO Ram/Core MaxCPU MaxWall Scratch Other
alice 2000 1320 1500 10000
atlas 2048 5760 5760 20000 Additional runtime requirements:
  • at least 4GB of VM for each job slot

Software installation common items:

  • the full compiler suite (C/C++ and Fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages on SL4-like nodes;
  • the recommended version of the compilers is 3.4.6;
  • the f2c and libgfortran libraries (in both i386 and x86_64 versions on x86_64 systems) are also required to run the software;
  • other libraries required are:
libpopt.so.0
libblas.so

Software installation setup (cvmfs sites):

Software installation requirements (non-cvmfs sites):

  • an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
biomed 100 1 1 100 For sites providing an SE, minimal required storage space is 1TB.
calice 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/calice.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
cernatschool.org 0 0 0 0
cms 2000 2880 4320 20000 Note: CMS usually sends 8-core pilots, values for 'Multi Core' refer to that. Single-core pilots are discouraged.

Jobs require an address space larger than the memory size specified above. Sites should allow processes to use at least 6 GB more virtual address space per core than memory, to accommodate the large number of shared libraries used by jobs. (For a typical 8-core pilot this translates into a VSIZE limit of at least 64 GB.)

Cloud resources should provision 8-core VMs to match standard 8-core pilots.

Input I/O requirement is an average 2.5 MB/s per thread from MSS.

All jobs need to have outbound connectivity.

Sites must not use pool accounts for the FQAN cms:/cms/Role=lcgadmin. For any other CMS job, sites need to use pool accounts so that at any time every grid credential is mapped to an independent local account.


National VOMS groups: in CMS, national VOMS groups, e.g. /cms/becms or /cms/dcms, are used. Proxies carrying such groups must be "supported" at all sites in the following way:

  • they should be treated like /cms (the base group), if no special treatment is wanted by the site;
  • proxies with such national groups must be able to write to /store/user/temp (the PFN associated with this LFN).
comet.j-parc.jp 2048 1440 2880 40960
dteam 0 0 0 0
dune 0 2880 2880 10000
enmr.eu 8000 2880 4320 1000 1. For COVID-19 related jobs, slots with 8 GB/core are required.
  2. The WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.
  3. The line:

"/enmr.eu/*"::::

has to be added to the group.conf file before the grid services are configured via YAIM. In the CREAM-CE this results in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" .enmr and "/enmr.eu/*" .enmr in both /etc/grid-security/grid-mapfile and /etc/grid-security/voms-grid-mapfile, and in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" enmr and "/enmr.eu/*" enmr in /etc/grid-security/groupmapfile. Every VO group added must be enabled in this way in order to implement per-application accounting.

epic.vo.gridpp.ac.uk 0 0 0 0
esr 2048 2100 0 0 Many applications only need part of the following. Java/Perl/Python/C/C++/FORTRAN77,-90,-95; IDL and MATLAB runtime; Scilab or Octave. Needs MPI for some applications.

Some applications require access to job output during execution, and some require interaction via X11. 1 GB RAM; some applications need 3 GB. Outbound connectivity from WN to databases is required. A shared file system is needed for MPI applications, with about 10 GB of space. Some applications need about 1000 simultaneously open files. Depending on the application, output file sizes range from a few MB to 5 GB, for a total of several hundred thousand files. No permanent storage is needed, only transient and durable storage. Low-latency scheduling for short jobs is needed.

fermilab 0 0 0 0
geant4 1000 650 850 300 Software is distributed via CernVM-FS

(http://cernvm.cern.ch/portal/filesystem); the configuration should include the geant4.cern.ch area and its dependencies (sft.cern.ch and grid.cern.ch).

CernVM-FS needs to be accessible on the WNs. The CernVM-FS cache area needed is about 5 GB.

gridpp 1000 1000 0 0
hyperk.org 0 1440 1440 10000
icecube 4000 2880 2880 40000 CVMFS is used for the software distribution via:

/cvmfs/icecube.opensciencegrid.org

ilc 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/ilc.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
ipv6.hepix.org 0 0 0 0
lhcb 0 0 0 20000 Further recommendations from LHCb for sites:

The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of an LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of the memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB, which needs to be added as a requirement to the field "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.

The amount of space in the field "Max size of scratch space used by jobs" shall be interpreted as 50 % each for downloaded input files and produced output files.

Sites should have the CentOS 7 or "CERN CentOS 7" operating system, or later versions, installed on their worker nodes. CPUs should support the x86_64_v2 instruction set (or later). Sites are requested to provide support for Singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points

"/cvmfs/lhcb.cern.ch/",
"/cvmfs/lhcb-condb.cern.ch/",
"/cvmfs/lhcbdev.cern.ch/",
"/cvmfs/unpacked.cern.ch/",
"/cvmfs/cernvm-prod.cern.ch/",
on the worker nodes. 

Sites should provision a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non-T1 sites providing CVMFS, direct HTCondorCE, ARC, or CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaigns).

Sites with disk storage must provide:

- an xroot endpoint (single DNS entry), at least for reading
- an HTTPS endpoint (single DNS entry), both read and write, supporting Third Party Copy
- a way to do the accounting (preferably following the WLCG TF standard: https://twiki.cern.ch/twiki/bin/view/LCG/StorageSpaceAccounting)

Sites with tape storage should be accessible from the other Tier 1 and Tier 2 sites. They should provide one of the supported WLCG tape systems (dCache or CTA). Tape classes to optimize data distribution are to be discussed on a per-site basis.

lsst 0 0 0 0 VO name must be "lsst" as it is an existing VO in OSG!

cf VOMS URL

lz 0 0 0 0
magic 1024 5000 0 0 Fortran77 and other compilers. See details in annex of MoU (documentation section).
mice 0 0 0 0
na62.vo.gridpp.ac.uk 2048 500 720 2048 VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch

Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch

ops 0 0 0 0
pheno 0 0 0 0
skatelescope.eu 0 0 0 0
snoplus.snolab.ca 2000 1440 2160 20000 g++

gcc python-devel uuid-devel zlib-devel

SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.

solidexperiment.org 0 0 0 0 will need to set up CVMFS.
t2k.org 1500 600 600 1000 t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
virgo 0 0 0 0
vo.complex-systems.eu 0 0 0 0
vo.cta.in2p3.fr 0 0 2000 0
vo.landslides.mossaic.org 0 0 0 0
vo.magrid.ma 0 0 0 0
vo.moedal.org 0 0 0 0
vo.northgrid.ac.uk 0 0 0 0
vo.scotgrid.ac.uk 0 0 0 0
vo.southgrid.ac.uk 0 0 0 0
zeus 2048 3600 5400 5000 CVMFS is used for the software distribution via:
/cvmfs/zeus.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
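Sites commonly translate rows of the table above into batch-system limits. The sketch below is illustrative only: slurm_limits is a hypothetical helper, the VO values are copied from the table, and the unit interpretation (MB for RAM/Core and Scratch, minutes for MaxCPU and MaxWall) is an assumption, since the table itself does not state units.

```python
# Requirements for a few VOs, copied from the table above.
# Assumed units (not stated in the table): RAM/Core in MB,
# MaxCPU and MaxWall in minutes, Scratch in MB; 0 means "no specific request".
REQUIREMENTS = {
    "atlas":   {"ram_mb": 2048, "cpu_min": 5760, "wall_min": 5760, "scratch_mb": 20000},
    "t2k.org": {"ram_mb": 1500, "cpu_min": 600,  "wall_min": 600,  "scratch_mb": 1000},
    "zeus":    {"ram_mb": 2048, "cpu_min": 3600, "wall_min": 5400, "scratch_mb": 5000},
}

def slurm_limits(vo):
    """Render a VO's requirements as SLURM-style sbatch options (illustrative only)."""
    r = REQUIREMENTS[vo]
    opts = []
    if r["ram_mb"]:
        opts.append(f"--mem-per-cpu={r['ram_mb']}M")
    if r["wall_min"]:
        hours, mins = divmod(r["wall_min"], 60)
        opts.append(f"--time={hours:02d}:{mins:02d}:00")
    return " ".join(opts)

print(slurm_limits("t2k.org"))  # --mem-per-cpu=1500M --time=10:00:00
```

The same mapping works for any scheduler; only the option syntax changes (e.g. Grid Engine's h_vmem and h_rt).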


== VO Activity ==

The VOs that are enabled at each site are listed in this VO table.

This page is a Key Document, and is the responsibility of Gerard Hand. It was last reviewed on 2024/04/23, 14:40:12 when it was considered to be 100% complete. It was last judged to be accurate on 2024/04/23.