== Introduction ==
  
The GridPP Project Management Board ('''PMB''') has agreed that up to 10% of GridPP's processing capability should be allocated to non-LHC work. VOs that access the Grid in this way must become [[Policies_for_GridPP_approved_VOs|Approved VOs]]; the policies for managing them are described at [[Policies_for_GridPP_approved_VOs]].

The tables below list the VOs that the GridPP PMB has approved; the PMB encourages support for these VOs at all of its collaborating sites. Information about all European Grid Initiative ('''EGI'''), global and local VOs is given in the [http://operations-portal.egi.eu/ EGI Operations Portal], which is the main reference source for VO information (including VO manager, end-points, requirements, etc.).
  
<!--
== Yum repository ==

RPM versions of the VOMS records for Approved VOs are available via the VOMS RPMS Yum Repository.

<div style="text-align:center">[http://hep.ph.liv.ac.uk/~sjones/RPMS.voms/ VOMS RPM Repository v1.16-1]
</div>
-->
  
Some sections of this document are automatically updated from the EGI Operations Portal (approximately once a week).

<div style="margin:auto; border:2px solid black;background-color:#EEEEEE;width:600px; max-width:97%">
<div style="font-size:1.2em; font-weight:bold; padding-left:4px;background-color:#7C8AAF;color:#fff;">Please Note</div>
<div style="padding:3px 6px">
Please do not change the '''vomsdir/''' or '''vomses/''' entries or the '''VO Resource Requirements''' section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!
</div>
</div>
<!--
== Cleanup Campaign ==

* [[VO Cleanup Campaign]]
-->
  
==Approved VOs==

{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+Approved VOs -->
|-style="background:#7C8AAF;color:white"
!Name
!Area
!Contact
<!-- start main approved list -->
|-
|[https://alice-collaboration.web.cern.ch/ alice]
|The ALICE Collaboration operates a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Its aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected.
|{{@|Latchezar.Betev|cern.ch}}<br>{{@|Maarten.Litmaath|cern.ch}}<br>{{@|costin.grigoras|cern.ch}}
  
 
|-
|[http://bes.ihep.ac.cn/ bes]
|The Beijing Spectrometer (BES) is a general-purpose detector located in the interaction region of the BEPC storage ring, where the electron and positron beams collide. The BES Collaboration consists of approximately 200 physicists and engineers from 27 institutions in 4 countries.
|
  
 
|-
|[http://lsgc.org/biomed.html biomed]
|This VO covers the areas related to health and life sciences. Currently, it is divided into 3 sectors: medical imaging, bioinformatics and drug discovery. The VO is openly accessible to academics, and to private companies for non-commercial purposes.
|{{@|glatard|creatis.insa-lyon.fr}}<br>{{@|jerome.pansanel|iphc.cnrs.fr}}<br>{{@|sorina.pop|creatis.insa-lyon.fr}}
  
 
|-
|[https://twiki.cern.ch/twiki/bin/view/CALICE/ calice]
|CAlorimeter for the LInear Collider Experiment (CALICE): a high-granularity calorimeter optimised for the Particle Flow measurement of multi-jet final states at the International Linear Collider, running at a centre-of-mass energy between 90 GeV and 1 TeV.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}
  
 
|-
|[http://cepc.ihep.ac.cn/ cepc]
|The Circular Electron Positron Collider (CEPC) is a large international scientific facility proposed by the Chinese particle physics community in 2012.
|
  
 
|-
|[http://researchinschools.org/CERN/ cernatschool.org]
|The CERN@school VO represents the CERN@school project on the Grid. CERN@school aims to bring CERN technology into the classroom to aid with the teaching of physics and to inspire the next generation of scientists and engineers. The CERN@school VO will allow students and teachers involved with the project to harness GridPP to store and analyse data from the CERN@school detectors, the LUCID experiment and the associated GEANT4 simulations.
|
  
 
|-
|[https://www.jlab.org/physics/hall-b/clas12 clas12]
|CLAS12 is the large-acceptance spectrometer experiment in Hall B at Jefferson Lab.
|
  
 
|-
|[http://wiki.grid.auth.gr/wiki/bin/view/ComplexityScienceSSC/VO vo.complex-systems.eu]
|The goal of vo.complex-systems.eu is to promote the study of complex systems and complex networks on the Grid infrastructure. The vo.complex-systems.eu Virtual Organization will also serve as the building layer of collaboration among international scientists focusing on the research area of Complexity Science.
|{{@|romain.reuillon|iscpif.fr}}
  
 
|-
|[http://comet.kek.jp comet.j-parc.jp]
|Muon-to-electron conversion experiment at J-PARC, which will be used by international COMET collaborators for design studies and data analysis. COMET will test Beyond-the-Standard-Model physics in a way that is complementary to the experiments at the LHC.
|{{@|daniela.bauer|imperial.ac.uk}}<br>{{@|Yoshi.Uchida|imperial.ac.uk}}<br>{{@|simon.fayer05|imperial.ac.uk}}
  
 
|-
|[https://confluence.egi.eu/display/EGIPP/DTEAM+VO dteam]
|The goal of the VO is to facilitate the deployment of a stable production Grid infrastructure. To this end, members of this VO (who have to be associated with a registered site and be involved in its operation) are allowed to run tests to validate the correct configuration of their site. Site performance evaluation and/or monitoring programs may also be run under the DTEAM VO with the approval of the Site Manager, subject to the agreement of the affected sites' management.
|{{@|kkoum|admin.grnet.gr}}<br>{{@|alessandro.paolini|egi.eu}}<br>{{@|matthew.viljoen|egi.eu}}<br>{{@|kyrginis|admin.grnet.gr}}
  
 
|-
|[http://www.wenmr.eu enmr.eu]
|Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with user-friendly tools, platforms for data analysis and exchange, and an underlying e-infrastructure. WeNMR groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organization, which is currently the second largest VO in the area of life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology towards EGI for the users of existing infrastructures. It involves researchers from around the world and will build bridges to other areas of structural biology. Integration with SAXS, a rapidly growing and highly complementary method, is directly included in WeNMR, but links will also be established to related initiatives. WeNMR will serve all relevant INSTRUCT communities in line with the ESFRI roadmap.
|{{@|Marco.Verlato|pd.infn.it}}<br>{{@|a.m.j.j.bonvin|uu.nl}}<br>{{@|rosato|cerm.unifi.it}}<br>{{@|giachetti|cerm.unifi.it}}<br>{{@|verlato|infn.it}}
  
 
|-
|[http://www.sruc.ac.uk/epic/ epic.vo.gridpp.ac.uk]
|EPIC replaces an earlier EPIC project that was focused on Veterinary Surveillance (Phase 1). This new consortium EPIC project aims to become a world leader in policy-linked research and includes some of Scotland's leading veterinary epidemiologists and scientists.

The overarching purpose of the Centre is to provide access to high-quality advice and analyses on the epidemiology and control of animal diseases that are important to Scotland, and to best prepare Scotland for the next major disease incursion. Ultimately, this strategic advice to the Scottish Government will help ensure that the interests of the various stakeholders involved in disease emergency planning and response are met as effectively as possible. This all must be achieved within the context of our rapidly changing environment. For example, issues such as climate change are now influencing the livestock disease risks that Scotland faces.
|{{@|thomas.doherty|glasgow.ac.uk}}
  
 
|-
|[http://www.euearthsciencegrid.org/content/esr-vo-introduction esr]
|Earth Science Research covers research in the fields of Solid Earth, Ocean, Atmosphere and their interfaces. A large variety of communities correspond to each domain, some of them covering several domains.

In the ESR Virtual Organization (ESR-VO) four domains are represented:

  1. Earth Observation
  2. Climate
  3. Hydrology
  4. Solid Earth Physics
|{{@|andre.gemuend|scai.fraunhofer.de}}<br>{{@|weissenb|ccr.jussieu.fr}}
  
 
|-
|[http://www.fnal.gov fermilab]
|Fermilab Virtual Organization (VO) - The Fermilab VO is an "umbrella" VO that includes the Fermilab Campus Grid (FermiGrid) and Fermilab Grid Testing (ITB) infrastructures, and all Fermilab computing activities that are not big enough to have their own Virtual Organization. Broadly these include the intensity frontier program, theoretical simulations, fixed target analysis, and accelerator and beamline design, as well as activities performed by the Fermilab Campus Grid administrators.
|{{@|garzoglio|fnal.gov}}<br>{{@|boyd|fnal.gov}}
  
 
|-
|[http://geant4.web.cern.ch/geant4/ geant4]
|Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high-energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278.
|{{@|Andrea.Sciaba|cern.ch}}<br>{{@|Andrea.Dotti|cern.ch}}
  
 
|-
|[http://www.gridpp.ac.uk gridpp]
|GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN. They are building a distributed computing Grid across the UK for particle physicists. At the moment there is a working particle physics Grid across 17 UK institutions.
|{{@|m.doidge|lancaster.ac.uk}}
  
 
|-
|[http://www.hyperk.org hyperk.org]
|We propose the Hyper-Kamiokande (Hyper-K) detector as a next-generation underground water Cherenkov detector. It will serve as a far detector of a long-baseline neutrino oscillation experiment envisioned for the upgraded J-PARC, and as a detector capable of observing -- far beyond the sensitivity of the Super-Kamiokande (Super-K) detector -- proton decays, atmospheric neutrinos, and neutrinos from astronomical origins. The baseline design of Hyper-K is based on the highly successful Super-K, taking full advantage of a well-proven technology.
|{{@|C.J.Walker|qmul.ac.uk}}<br>{{@|francesca.di_lodovico|kcl.ac.uk}}
  
 
|-
|[http://www.icecube.wisc.edu/ icecube]
|The goal of the VO is to enable the usage of Grid resources for ICECUBE collaboration members, mainly for simulation and reconstruction.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}<br>{{@|andreas.haupt|desy.de}}
  
 
|-
|[http://www-flc.desy.de/ ilc]
|VO for the International Linear Collider community.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}<br>{{@|Christoph.Wissing|desy.de}}
  
 
|-
|[https://voms2.cnaf.infn.it:8443/voms/ipv6.hepix.org/admin/home.action ipv6.hepix.org]
|The goal of the VO is to carry out testing of IPv6 readiness, functionality and performance of the middleware, applications and tools required by the stakeholder communities, especially HEP. Other authorised activities include use of the testbed by related IPv6 activities inside EGI, the related middleware technology providers and other infrastructures used by WLCG/HEP.
|{{@|david.kelsey|stfc.ac.uk}}
  
|-
|[http://lz.lbl.gov/ lz]
|This VO supports the LUX-ZEPLIN (LZ) experiment, designed to search for Dark Matter.
|{{@|E.Korolkova|sheffield.ac.uk}}<br>{{@|j.dobson|ucl.ac.uk}}
  
|-
|[http://magic.mppmu.mpg.de magic]
|MAGIC is a system of two imaging atmospheric Cherenkov telescopes (IACTs). MAGIC-I started routine operation after commissioning in 2004. Construction of MAGIC-II was completed in early 2009, and the two telescopes have been in operation ever since, with a break in 2012 for an upgrade that achieved full homogeneity. The project is funded primarily by the funding agencies BMBF (Germany), MPG (Germany), INFN (Italy), MICINN (Spain), and ETH Zurich (Switzerland).
|{{@|neissner|pic.es}}<br>{{@|contrera|gae.ucm.es}}<br>{{@|rfirpo|pic.es}}
  
|-
|[http://www.magrid.ma vo.magrid.ma]
|vo.magrid.ma is a multidisciplinary VO providing general Grid services and support to the Moroccan scientific community.
|{{@|rahim|cnrst.ma}}
  
|-
|[http://www.mice.iit.edu/ mice]
|A VO to support the activities of the Muon Ionisation Cooling Experiment (MICE). Specifically, it enables the moving of MICE data around the Grid, followed by the submission of analyses of these data. This is expected to be a small VO.
|{{@|d.colling|imperial.ac.uk}}<br>{{@|p.hodgson|sheffield.ac.uk}}<br>{{@|daniela.bauer|imperial.ac.uk}}<br>{{@|janusz.martyniak|imperial.ac.uk}}
  
|-
|[https://microboone.fnal.gov/ uboone]
|MicroBooNE is a large 170-ton liquid-argon time projection chamber (LArTPC) neutrino experiment located on the Booster neutrino beamline at Fermilab.
|

|-
|[https://www.psi.ch/en/mu3e mu3e]
|The Mu3e experiment is a new search for the lepton-flavour-violating decay of a positive muon into two positrons and one electron.
|
  
 
|-
|[https://na62.gla.ac.uk/ na62.vo.gridpp.ac.uk]
|The NA62 VO (na62.vo.gridpp.ac.uk) provides Grid computing and data storage resources to the NA62 collaboration. The NA62 VO is supported by the University of Cambridge, University of Glasgow, Imperial College London, University of Birmingham, University of Lancaster, University of Liverpool, University of Manchester, Oxford University and RAL (from the UK), CERN, CNAF (Italy) and UCL (Belgium). More information about the NA62 experiment can be found at http://na62.web.cern.ch/na62/. The production portal is located at http://na62.gla.ac.uk/
|{{@|Dan.Protopopescu|glasgow.ac.uk}}<br>{{@|David.Britton|glasgow.ac.uk}}
  
 
|-
|[https://wiki.egi.eu/wiki/OPS_vo ops]
|The goal of the VO is to facilitate the operations of the LCG/EGI infrastructure, which includes running official monitoring, re-certification and performance evaluation tools. Additionally, the VO will be used for interoperations with other Grid infrastructures.
|{{@|eimamagi|srce.hr}}<br>{{@|alessandro.paolini|egi.eu}}
  
 
|-
|[http://www.phenogrid.dur.ac.uk/ pheno]
|Phenogrid is the VO for UK theorists who don't fit within one of the LHC experiments (e.g. developers of Monte Carlo generators).
|{{@|jeppe.andersen|durham.ac.uk}}<br>{{@|adam.j.boutcher|durham.ac.uk}}<br>{{@|paul.clark|durham.ac.uk}}
  
 
|-
|[https://snoplus.phy.queensu.ca/ snoplus.snolab.ca]
|VO for the SNO+ experiment, a multi-purpose liquid scintillator neutrino experiment based in Sudbury, Canada. Members of the snoplus virtual organisation will contribute to the European computing effort to accurately simulate the SNO+ detector response.
|{{@|Jeanne.wilson|kcl.ac.uk}}<br>{{@|C.J.Walker|qmul.ac.uk}}<br>{{@|m.mottram|qmul.ac.uk}}
  
 
|-
|[http://www.imperial.ac.uk/high-energy-physics/research/experiments/solid/ solidexperiment.org]
|Supports Grid users of the SoLid experiment.
|{{@|daniela.bauer|imperial.ac.uk}}<br>{{@|antonin.vacheret|imperial.ac.uk}}
  
 
|-
|[http://www.t2k.org t2k.org]
|T2K is a neutrino experiment designed to investigate how neutrinos change from one flavour to another as they travel (neutrino oscillations). An intense beam of muon neutrinos is generated at the J-PARC nuclear physics site on the east coast of Japan and directed across the country to the Super-Kamiokande neutrino detector in the mountains of western Japan. The beam is measured once before it leaves the J-PARC site, using the near detector ND280, and again at Super-K: the change in the measured intensity and composition of the beam is used to provide information on the properties of neutrinos.
|{{@|sophie.king|kcl.ac.uk}}<br>{{@|tomislav.vladisavljevic|stfc.ac.uk}}
  
 
|-
|[http://wwwcascina.virgo.infn.it/ virgo]
|Scientific target: detection of gravitational waves. Gravitational waves are predicted by the General Theory of Relativity but have not yet been directly detected, owing to their extremely weak interaction with matter. Large interferometric detectors, like Virgo, are operating with the aim of directly detecting gravitational signals from various astrophysical sources. Signals are expected to be deeply buried in detector noise, and suitable data analysis algorithms are developed to allow detection and signal parameter estimation. For many kinds of searches large computing resources are needed, and in some important cases we are computationally bound: the larger the available computing power, the wider the portion of source parameter space that can be explored.

VO target: to allow data management and computationally intensive data analysis.
|{{@|cristiano.palomba|roma1.infn.it}}<br>{{@|alberto.colla|roma1.infn.it}}
  
 
|-
|[http://mossaic.org/ vo.landslides.mossaic.org]
|A virtual organisation for landslide modellers associated with the Management of Slope Stability in Communities (MoSSaiC) project. The VO is used for running landslide modelling software such as CHASM and QUESTA.
|{{@|l.kreczko|bristol.ac.uk}}
  
 
|-
|[http://moedal.org vo.moedal.org]
|The MoEDAL VO allows members of the MoEDAL Collaboration to perform all of the computing activities relevant for the MoEDAL experiment, making use of available resources according to the policy defined by the Collaboration.
|{{@|t.whyntie|qmul.ac.uk}}<br>{{@|daniel.felea|cern.ch}}
 
|-
|[https://voms.gridpp.ac.uk:8443/voms/vo.northgrid.ac.uk vo.northgrid.ac.uk]
|Regional Virtual Organisation created to allow access to HEP resources to other local disciplines from NorthGrid sites: Manchester, Lancaster, Liverpool, Sheffield. Users from these universities can apply.
|{{@|alessandra.forti|cern.ch}}<br>{{@|robert.frank|manchester.ac.uk}}
 
|-
|[http://www.scotgrid.ac.uk/ vo.scotgrid.ac.uk]
|The VO is for academic and other users in Scotland to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access. It is also designed as a test VO to allow maintenance and operational testing of site services.
|{{@|garth.roy|glasgow.ac.uk}}
  
 
|-
|[http://www.southgrid.ac.uk/VO/ vo.southgrid.ac.uk]
|The VO is for academic and other users in the SouthGrid region (UKI-SOUTHGRID-BHAM-HEP, UKI-SOUTHGRID-BRIS-HEP, UKI-SOUTHGRID-CAM-HEP, UKI-SOUTHGRID-OX-HEP, UKI-SOUTHGRID-RALPP, UKI-SOUTHGRID-SUSX) to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access.
|{{@|pete.gronbech|physics.ox.ac.uk}}
  
 
|-
|[http://www-zeus.desy.de/ zeus]
|ZEUS is a collaboration of about 450 physicists who are running a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. The ZEUS detector is a sophisticated tool for studying the particle reactions provided by the high-energy beams of the HERA accelerator. Thus the participating scientists are pushing forward our knowledge of the fundamental particles and forces of nature, gaining unsurpassed insight into the exciting laws of the microcosm.
|{{@|thomas.hartmann|desy.de}}<br>{{@|andreas.gellrich|desy.de}}
<!-- end main approved list -->
|}
  
== IRIS Partners ==

{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+Other VOs -->
|-style="background:#7C8AAF;color:white"
!Name
!Area
!Contact
 
<!-- Not recorded as active by QMUL

|-
| CCFE
|
|

|-
| CCP4
|
|

|-
| CASU
|
|

|-
| LIGO
|
|

|-
| Central Laser Facility
|
|

|-
| Gaia
|
|

|-
| ISIS
|
|

|-
| ALMA
|
|

|-
| MERLIN
|
|

|-
| EUCLID
|
|

|-
| diamond
|
|

|-
| LZ UK
|
|

|-
| WFAU
|
|
-->
<!-- start iris approved list -->
|-
|[https://atlas.cern/ atlas]
|The ATLAS VO allows the members of the ATLAS collaboration to perform all the computing activities relevant for the ATLAS experiment, making use of the available resources following the policy defined by the Collaboration.
|{{@|Alessandro.DeSalvo|roma1.infn.it}}<br>{{@|Elisabetta.Vilucchi|lnf.infn.it}}<br>{{@|jd|bnl.gov}}<br>{{@|james.william.walder|cern.ch}}
  
 
|-
|[http://cms.cern.ch/iCMS/ cms]
|The Compact Muon Solenoid (CMS) experiment is a large general-purpose particle physics detector built on the proton-proton Large Hadron Collider (LHC) at CERN in Switzerland.
|{{@|Andreas.Pfeiffer|cern.ch}}<br>{{@|stefano.belforte|cern.ch}}<br>{{@|stefano.belforte|ts.infn.it}}<br>{{@|Daniele.Bonacorsi|bo.infn.it}}<br>{{@|Christoph.Wissing|desy.de}}<br>{{@|sexton|gmail.com}}<br>{{@|lammel|fnal.gov}}<br>{{@|jose.hernandez|ciemat.es}}<br>{{@|gutsche|fnal.gov}}<br>{{@|Andrea.Sciaba|cern.ch}}
  
 
|-
|[https://portal.cta-observatory.org/Pages/Home.aspx vo.cta.in2p3.fr]
|Monte Carlo simulation production and analysis for the "CTA - Cherenkov Telescope Array" international consortium.
|{{@|cecile.barbier|lapp.in2p3.fr}}<br>{{@|arrabito|in2p3.fr}}
  
 
|-
|[http://www.dunescience.org dune]
|DUNE is the Deep Underground Neutrino Experiment, managed by the global DUNE collaboration and hosted at Fermilab. We are building a deep-underground liquid-argon-based neutrino detector to study accelerator-based neutrino oscillations, supernova neutrinos, and nucleon decay.
|{{@|andrew.mcnab|cern.ch}}<br>{{@|timm|fnal.gov}}
  
 
|-
|[http://lhcb.web.cern.ch/lhcb/ lhcb]
|The LHCb (Large Hadron Collider beauty) experiment is mainly set on finding the solution to the mystery of the matter-antimatter imbalance in the Universe.
|{{@|andrew.mcnab|cern.ch}}<br>{{@|concezio.bozzi|cern.ch}}<br>{{@|christophe.denis.haen|cern.ch}}<br>{{@|jan.van.eldik|cern.ch}}<br>{{@|joel.closier|cern.ch}}<br>{{@|ben.couturier|cern.ch}}
  
 
|-
|[http://www.lsst.org/lsst/ lsst]
|The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey telescope with a 3200-megapixel camera, built to image faint astronomical objects, rapidly scan the sky and observe probes of dark matter and dark energy. LSST Data Management and Simulation jobs will run on OSG and EGI.
|{{@|boutigny|in2p3.fr}}<br>{{@|IGoodenow|lsst.org}}<br>{{@|fabio|in2p3.fr}}<br>{{@|yangw|SLAC.stanford.edu}}<br>{{@|kherner|fnal.gov}}
  
 
|-
|[https://www.skatelescope.org/the-ska-project/ skatelescope.eu]
|The Square Kilometre Array (SKA) project is an international effort to build the world's largest radio telescope, with eventually over a square kilometre (one million square metres) of collecting area. The scale of the SKA represents a huge leap forward in both engineering and research & development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavours in history, the SKA will bring together a wealth of the world's finest scientists, engineers and policy makers to bring the project to fruition.

The VO skatelescope.eu supports this project.
|{{@|alessandra.forti|cern.ch}}<br>{{@|andrew.mcnab|cern.ch}}<br>{{@|rohini.joshi|manchester.ac.uk}}
  
 
|-
|[https://eucliduk.net/ eucliduk.net]
|The Euclid mission aims at understanding why the expansion of the Universe is accelerating and what is the nature of the source responsible for this acceleration, which physicists refer to as dark energy.
|{{@|msh|roe.ac.uk}}
<!-- end iris approved list -->
|}
  
== Approved VOs being established into GridPP infrastructure ==

As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs (your site cannot support these yet, but when a VO is set up and functioning we will let you know).

{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+VOs being established  -->
|-style="background:#7C8AAF;color:white"
!Name
!Area
!Contact
<!-- start new approved list -->
<!-- end new approved list -->
|}
  
== VOs that have been removed ==

The table below comprises a history of VOs that have been removed from the approved list for various reasons.
  
{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%;"
<!-- |+VOs that have been removed -->
|-style="background:#7C8AAF;color:white"
!Name
!Date of removal
!Notes

|-
|camont
|7th June 2017
|none

|-
|camont.gridpp.ac.uk
|9 Oct 2013
|none

|-
|[https://voms.egi.cesga.es:8443/voms/fusion/register/start.action fusion]
|30 Jan 2017
|Discussion with Rubén Vallés Pérez. VO appears defunct.

|-
|hone
|24 Nov 2015
|Discussed at Ops Meeting. Defunct.

|-
|superbvo.org
|19 Jan 2016
|Discussed at Ops Meeting. Defunct.

|-
|supernemo.vo.eu-egee.org
|24 Feb 2020
|now called supernemo.org

|-
|totalep
|9 Oct 2013
|none

|-
|vo.londongrid.ac.uk
|in progress [https://ggus.eu/?mode=ticket_info&ticket_id=129065 GGUS]
|VO not used any more

|-
|vo.sixt.cern.ch
|11 Nov 2015
|No members, no voms servers, defunct
|}
  
 
== Example site-info.def entries ==

The examples of site-info.def entries for yaim have been moved: [[ExampleSiteinfoDefEntries|Example site-info.def entries]]
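For orientation only: yaim per-VO settings follow the VO_&lt;NAME&gt;_* naming convention (the VO name upper-cased, with dots and dashes replaced by underscores). The sketch below shows the general shape of such entries for the gridpp VO; the values are illustrative, not the maintained ones, so always take the real entries from the page linked above.

<pre><nowiki>
# Illustrative sketch only -- take the maintained values from ExampleSiteinfoDefEntries.
# Space-separated list of VOs the site supports:
VOS="alice atlas dteam gridpp"

# Per-VO settings (here for the gridpp VO):
VO_GRIDPP_SW_DIR=$VO_SW_DIR/gridpp
VO_GRIDPP_DEFAULT_SE=$DPM_HOST
VO_GRIDPP_VOMS_SERVERS="'vomss://voms.gridpp.ac.uk:8443/voms/gridpp?/gridpp/'"
VO_GRIDPP_VOMSES="'gridpp voms.gridpp.ac.uk 15000 /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk gridpp'"
VO_GRIDPP_VOMS_CA_DN="'/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B'"
</nowiki></pre>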
  
 
<div style="margin:auto; border:2px solid black;background-color:#EEEEEE;width:600px; max-width:97%">
<div style="font-size:1.2em; font-weight:bold; padding-left:4px;background-color:#7C8AAF;color:#fff;">Please Note</div>
<div style="padding:3px 6px">
Please do not change the '''vomsdir/''' or '''vomses/''' entries below, as they are automatically updated from the EGI Operations Portal.
Any changes you make will be lost!
</div>
</div>
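Each VO box below carries two kinds of record. An LSC file under /etc/grid-security/vomsdir/&lt;vo&gt;/ holds two DNs: the VOMS server's host certificate subject on the first line and its issuing CA on the second. A vomses file under /etc/vomses/ holds a single quoted line of the form "alias" "host" "port" "server DN" "vo name". Once both are installed for a VO, a quick smoke test (assuming a valid user certificate that is registered in the VO) is:

<pre><nowiki>
voms-proxy-init --voms alice    # request a proxy carrying a VOMS attribute for the alice VO
voms-proxy-info --all           # inspect the proxy; the VO and its FQANs should be listed
</nowiki></pre>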
  
  
<!-- START OF SIDSECTION -->{{BOX VO|ALICE|<!-- VOMS RECORDS for ALICE -->
''' Filename: ''' /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/alice/voms-alice-auth.app.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/alice-lcg-voms2.cern.ch
<pre><nowiki>
"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"
</nowiki></pre>

''' Filename: ''' /etc/vomses/alice-voms2.cern.ch
<pre><nowiki>
"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"
</nowiki></pre>

''' Filename: ''' /etc/vomses/alice-voms-alice-auth.app.cern.ch
<pre><nowiki>
"alice" "voms-alice-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch" "alice"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|ATLAS|<!-- VOMS RECORDS for ATLAS -->
''' Filename: ''' /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/atlas-lcg-voms2.cern.ch
<pre><nowiki>
"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"
</nowiki></pre>

''' Filename: ''' /etc/vomses/atlas-voms2.cern.ch
<pre><nowiki>
"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"
</nowiki></pre>

''' Filename: ''' /etc/vomses/atlas-voms-atlas-auth.app.cern.ch
<pre><nowiki>
"atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|BES|<!-- VOMS RECORDS for BES -->
''' Filename: ''' /etc/grid-security/vomsdir/bes/voms.ihep.ac.cn.lsc
<pre><nowiki>
/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/bes-voms.ihep.ac.cn
<pre><nowiki>
"bes" "voms.ihep.ac.cn" "15001" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "bes"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|BIOMED|<!-- VOMS RECORDS for BIOMED -->
''' Filename: ''' /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/biomed-cclcgvomsli01.in2p3.fr
<pre><nowiki>
"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "biomed"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|CALICE|<!-- VOMS RECORDS for CALICE -->
''' Filename: ''' /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/calice-grid-voms.desy.de
<pre><nowiki>
"calice" "grid-voms.desy.de" "15102" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "calice"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|CEPC|<!-- VOMS RECORDS for CEPC -->
''' Filename: ''' /etc/grid-security/vomsdir/cepc/voms.ihep.ac.cn.lsc
<pre><nowiki>
/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/cepc-voms.ihep.ac.cn
<pre><nowiki>
"cepc" "voms.ihep.ac.cn" "15005" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "cepc"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|CERNATSCHOOL.ORG|<!-- VOMS RECORDS for CERNATSCHOOL.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|CLAS12|<!-- VOMS RECORDS for CLAS12 -->
Notes:
n/a
}}
{{BOX VO|CMS|<!-- VOMS RECORDS for CMS -->
''' Filename: ''' /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/cms-lcg-voms2.cern.ch
<pre><nowiki>
"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cms-voms2.cern.ch
<pre><nowiki>
"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cms-voms-cms-auth.app.cern.ch
<pre><nowiki>
"cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|COMET.J-PARC.JP|<!-- VOMS RECORDS for COMET.J-PARC.JP -->
''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms02.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms02.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms03.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms03.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

Notes:
n/a
}}
 
</nowiki></pre>
 
</nowiki></pre>
  
 
+
Notes:
Notes:  
+
 
n/a
 
n/a
 
}}
 
}}
 +
  
  
{{BOX VO|DUNE|<!-- VOMS RECORDS for DUNE -->
''' Filename: ''' /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/dune-voms1.fnal.gov
<pre><nowiki>
"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "dune"
</nowiki></pre>

''' Filename: ''' /etc/vomses/dune-voms2.fnal.gov
<pre><nowiki>
"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "dune"
</nowiki></pre>

Notes:
n/a
}}
+
 
''' Filename: ''' /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc
+
{{BOX VO|ENMR.EU|<!-- VOMS RECORDS for ENMR.EU -->
 +
''' Filename: ''' /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc
 
<pre><nowiki>
 
<pre><nowiki>
/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl
+
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=NIKHEF/CN=NIKHEF medium-security certification auth
+
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
 
</nowiki></pre>
 
</nowiki></pre>
  
''' Filename: ''' /etc/vomses/esr-voms.grid.sara.nl
+
''' Filename: ''' /etc/vomses/enmr.eu-voms2.cnaf.infn.it
 
<pre><nowiki>
 
<pre><nowiki>
"esr" "voms.grid.sara.nl" "30001" "/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl" "esr"
+
"enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "enmr.eu"
 
</nowiki></pre>
 
</nowiki></pre>
  
 
+
Notes:
Notes:  
+
 
n/a
 
n/a
 
}}
 
}}
 +
  
  
{{BOX VO|ESR|<!-- VOMS RECORDS for ESR -->
''' Filename: ''' /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc
<pre><nowiki>
/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl
/C=NL/O=NIKHEF/CN=NIKHEF medium-security certification auth
</nowiki></pre>

''' Filename: ''' /etc/vomses/esr-voms.grid.sara.nl
<pre><nowiki>
"esr" "voms.grid.sara.nl" "30001" "/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl" "esr"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|EUCLIDUK.NET|<!-- VOMS RECORDS for EUCLIDUK.NET -->
''' Filename: ''' /etc/grid-security/vomsdir/eucliduk.net/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/eucliduk.net-voms.gridpp.ac.uk
<pre><nowiki>
"eucliduk.net" "voms.gridpp.ac.uk" "15518" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "eucliduk.net"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|FERMILAB|<!-- VOMS RECORDS for FERMILAB -->
''' Filename: ''' /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/fermilab-voms1.fnal.gov
<pre><nowiki>
"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "fermilab"
</nowiki></pre>

''' Filename: ''' /etc/vomses/fermilab-voms2.fnal.gov
<pre><nowiki>
"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "fermilab"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|HYPERK.ORG|<!-- VOMS RECORDS for HYPERK.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/hyperk.org-voms.gridpp.ac.uk
<pre><nowiki>
"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/hyperk.org-voms02.gridpp.ac.uk
<pre><nowiki>
"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/hyperk.org-voms03.gridpp.ac.uk
<pre><nowiki>
"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|ICECUBE|<!-- VOMS RECORDS for ICECUBE -->
''' Filename: ''' /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/icecube-grid-voms.desy.de
<pre><nowiki>
"icecube" "grid-voms.desy.de" "15106" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "icecube"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|ILC|<!-- VOMS RECORDS for ILC -->
''' Filename: ''' /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/ilc-grid-voms.desy.de
<pre><nowiki>
"ilc" "grid-voms.desy.de" "15110" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "ilc"
</nowiki></pre>

Notes:
n/a
}}
{{BOX VO|IPV6.HEPIX.ORG|<!-- VOMS RECORDS for IPV6.HEPIX.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it
<pre><nowiki>
"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"
</nowiki></pre>

Notes:
n/a
}}
Line 882: Line 1,140:
 
{{BOX VO|LSST|<!-- VOMS RECORDS for LSST -->
''' Filename: ''' /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc
<pre><nowiki>
/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/lsst-voms.slac.stanford.edu
<pre><nowiki>
"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu" "lsst"
</nowiki></pre>

Notes:
voms.fnal.gov is only an admin interface. It should not be configured on the machines because it cannot give proxies.

Sites supporting lsst are advised to read GGUS 117587.

(Former advice was: "It would not do any harm to have it on service nodes but should not be installed on any UI.")
}}
 
{{BOX VO|LZ|<!-- VOMS RECORDS for LZ -->
''' Filename: ''' /etc/grid-security/vomsdir/lz/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/lz/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/lz-voms.gridpp.ac.uk
<pre><nowiki>
"lz" "voms.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "lz"
</nowiki></pre>

''' Filename: ''' /etc/vomses/lz-voms02.gridpp.ac.uk
<pre><nowiki>
"lz" "voms02.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "lz"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|MAGIC|<!-- VOMS RECORDS for MAGIC -->
''' Filename: ''' /etc/grid-security/vomsdir/magic/voms01.pic.es.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms01.pic.es
/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA eScience SSL CA 3
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/magic/voms02.pic.es.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms02.pic.es
/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA eScience SSL CA 3
</nowiki></pre>

''' Filename: ''' /etc/vomses/magic-voms01.pic.es
<pre><nowiki>
"magic" "voms01.pic.es" "15003" "/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms01.pic.es" "magic"
</nowiki></pre>

''' Filename: ''' /etc/vomses/magic-voms02.pic.es
<pre><nowiki>
"magic" "voms02.pic.es" "15003" "/DC=org/DC=terena/DC=tcs/C=ES/ST=Barcelona/L=Bellaterra/O=Port dInformacio Cientifica/CN=voms02.pic.es" "magic"
</nowiki></pre>

Notes:
n/a
}}

{{BOX VO|MU3E|<!-- VOMS RECORDS for MU3E -->
''' Filename: ''' /etc/grid-security/vomsdir/mu3e/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/mu3e-voms.gridpp.ac.uk
<pre><nowiki>
"mu3e" "voms.gridpp.ac.uk" "15516" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mu3e"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|NA62.VO.GRIDPP.AC.UK|<!-- VOMS RECORDS for NA62.VO.GRIDPP.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

Notes:
n/a
}}
  
Line 1,059: Line 1,312:
 
</nowiki></pre>
 
</nowiki></pre>
  
 
+
Notes:
Notes:  
+
 
n/a
 
n/a
 
}}
 
}}
 +
  
  
Line 1,099: Line 1,352:
 
</nowiki></pre>
 
</nowiki></pre>
  
 
+
Notes:
Notes:  
+
 
n/a
 
n/a
 
}}
 
}}
  
  
{{BOX VO|SKATELESCOPE.EU|<!-- VOMS RECORDS for SKATELESCOPE.EU -->
''' Filename: ''' /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk
<pre><nowiki>
"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"
</nowiki></pre>

''' Filename: ''' /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk
<pre><nowiki>
"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"
</nowiki></pre>

''' Filename: ''' /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk
<pre><nowiki>
"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"
</nowiki></pre>

Notes:
n/a
}}
  
  
  
  
{{BOX VO|SOLIDEXPERIMENT.ORG|<!-- VOMS RECORDS for SOLIDEXPERIMENT.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|SOLIDEXPERIMENT.ORG|<!-- VOMS RECORDS for SOLIDEXPERIMENT.ORG -->
+
 
''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc
+
{{BOX VO|T2K.ORG|<!-- VOMS RECORDS for T2K.ORG -->
 +
''' Filename: ''' /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc
 
<pre><nowiki>
 
<pre><nowiki>
 
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
 
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
Line 1,210: Line 1,485:
 
</nowiki></pre>
 
</nowiki></pre>
  
''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc
+
''' Filename: ''' /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc
 
<pre><nowiki>
 
<pre><nowiki>
 
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
 
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
Line 1,216: Line 1,491:
 
</nowiki></pre>
 
</nowiki></pre>
  
''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc
+
''' Filename: ''' /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc
 
<pre><nowiki>
 
<pre><nowiki>
 
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
 
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
Line 1,222: Line 1,497:
 
</nowiki></pre>
 
</nowiki></pre>
  
''' Filename: ''' /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk
+
''' Filename: ''' /etc/vomses/t2k.org-voms.gridpp.ac.uk
 
<pre><nowiki>
 
<pre><nowiki>
"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"
+
"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"
 
</nowiki></pre>
 
</nowiki></pre>
  
''' Filename: ''' /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk
+
''' Filename: ''' /etc/vomses/t2k.org-voms02.gridpp.ac.uk
 
<pre><nowiki>
 
<pre><nowiki>
"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"
+
"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"
 
</nowiki></pre>
 
</nowiki></pre>
  
''' Filename: ''' /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk
+
''' Filename: ''' /etc/vomses/t2k.org-voms03.gridpp.ac.uk
 
<pre><nowiki>
 
<pre><nowiki>
"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"
+
"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"
 
</nowiki></pre>
 
</nowiki></pre>
  
 +
Notes:
 +
n/a
 +
}}
  
Notes:  
+
 
 +
 
 +
{{BOX VO|UBOONE|<!-- VOMS RECORDS for UBOONE -->
 +
Notes:
 
n/a
 
n/a
 
}}
 
}}
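All of the /etc/vomses files on this page share the same five-field layout, so a bad copy-and-paste is easy to catch mechanically. A small consistency check (a sketch; it only counts quote-delimited fields and does not validate the DNs themselves):

<pre><nowiki>
# Sketch: flag /etc/vomses lines that do not have exactly five quoted fields.
# Splitting on '"' turns a well-formed line into 11 awk fields.
for f in /etc/vomses/*; do
  awk -v FS='"' 'NF && $0 !~ /^#/ && NF != 11 { printf "%s:%d: malformed vomses line\n", FILENAME, FNR }' "$f"
done
</nowiki></pre>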
  
  
{{BOX VO|VIRGO|<!-- VOMS RECORDS for VIRGO -->
''' Filename: ''' /etc/grid-security/vomsdir/virgo/voms.cnaf.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/virgo-voms.cnaf.infn.it
<pre><nowiki>
"virgo" "voms.cnaf.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it" "virgo"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|VO.COMPLEX-SYSTEMS.EU|<!-- VOMS RECORDS for VO.COMPLEX-SYSTEMS.EU -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.complex-systems.eu/voms2.hellasgrid.gr.lsc
<pre><nowiki>
/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.complex-systems.eu-voms2.hellasgrid.gr
<pre><nowiki>
"vo.complex-systems.eu" "voms2.hellasgrid.gr" "15160" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "vo.complex-systems.eu"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|VO.CTA.IN2P3.FR|<!-- VOMS RECORDS for VO.CTA.IN2P3.FR -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.cta.in2p3.fr/cclcgvomsli01.in2p3.fr.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.cta.in2p3.fr-cclcgvomsli01.in2p3.fr
<pre><nowiki>
"vo.cta.in2p3.fr" "cclcgvomsli01.in2p3.fr" "15008" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "vo.cta.in2p3.fr"
</nowiki></pre>

Notes:
n/a
}}
  
{{BOX VO|VO.LANDSLIDES.MOSSAIC.ORG|<!-- VOMS RECORDS for VO.LANDSLIDES.MOSSAIC.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk
<pre><nowiki>
"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|VO.MAGRID.MA|<!-- VOMS RECORDS for VO.MAGRID.MA -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.magrid.ma/voms.magrid.ma.lsc
<pre><nowiki>
/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma
/C=MA/O=MaGrid/CN=MaGrid CA
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.magrid.ma-voms.magrid.ma
<pre><nowiki>
"vo.magrid.ma" "voms.magrid.ma" "15001" "/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma" "vo.magrid.ma"
</nowiki></pre>

Notes:
n/a
}}
 
  
{{BOX VO|VO.MOEDAL.ORG|<!-- VOMS RECORDS for VO.MOEDAL.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc
<pre><nowiki>
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|VO.NORTHGRID.AC.UK|<!-- VOMS RECORDS for VO.NORTHGRID.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|VO.SCOTGRID.AC.UK|<!-- VOMS RECORDS for VO.SCOTGRID.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|VO.SOUTHGRID.AC.UK|<!-- VOMS RECORDS for VO.SOUTHGRID.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

Notes:
n/a
}}
  
  
{{BOX VO|ZEUS|<!-- VOMS RECORDS for ZEUS -->
''' Filename: ''' /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/zeus-grid-voms.desy.de
<pre><nowiki>
"zeus" "grid-voms.desy.de" "15112" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "zeus"
</nowiki></pre>

Notes:
n/a
}}
  
<!-- END OF SIDSECTION -->


== Not Listed in the EGI Operations Portal ==


{{BOX VO|PLANCK|<!-- VOMS RECORDS for PLANCK -->
''' Filename: ''' /etc/grid-security/vomsdir/planck/voms.cnaf.infn.it.lsc
<pre><nowiki>
/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority
</nowiki></pre>

''' Filename: ''' /etc/vomses/planck-voms.cnaf.infn.it
<pre><nowiki>
"planck" "voms.cnaf.infn.it" "15002" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it" "planck"
</nowiki></pre>

Notes:
n/a
}}
  
== VO Resource Requirements ==

<div style="margin:auto; border:2px solid black;background-color:#EEEEEE;width:600px; max-width:97%">
<div style="font-size:1.2em; font-weight:bold; padding-left:4px;background-color:#7C8AAF;color:#fff;">Please Note</div>
<div style="padding:3px 6px">
Please do not change the table below, as it is automatically updated from the EGI Operations Portal. Any changes you make will be lost!
</div>
</div>


<!-- START OF RESOURCES -->{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;width:100%"
<!-- |+VO Resource Requirements -->
|-style="background:#7C8AAF;color:white"
!VO
!RAM/Core (MB)
!Max CPU time (min)
!Max wall time (min)
!Scratch (MB)
!Other
 
 
|-
|alice
|
|
|
|10000
|
|-
|atlas
|2048
|5760
|5760
|20000
|Additional runtime requirements:
* at least 4GB of VM for each job slot

Software installation common items:
* the full compiler suite (c/c++ and fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages in SL4-like nodes;
* the recommended version of the compilers is 3.4.6;
* the f2c and libgfortran libraries (in both i386 and x86_64 versions, in the case of x86_64 systems) are also required to run the software;
* other libraries required are:
:libpopt.so.0
:libblas.so
* other applications required are: uuencode, uudecode, bc, curl;
* high priority in the batch system for the atlassgm user;
* for nodes running at 64 bits, a copy of python compiled at 32 bits is also needed to use the 32-bit python bindings in the middleware. See https://twiki.cern.ch/twiki/bin/view/Atlas/RPMcompatSLC4 for more details;
* for SL5 nodes please refer to https://twiki.cern.ch/twiki/bin/view/Atlas/RPMCompatSLC5 and https://twiki.cern.ch/twiki/bin/view/Atlas/SL5Migration ;
* for SL6 nodes please refer to https://twiki.cern.ch/twiki/bin/view/AtlasComputing/RPMCompatSLC6 and https://twiki.cern.ch/twiki/bin/view/LCG/SL6Migration

Software installation setup (cvmfs sites):
* https://twiki.cern.ch/twiki/bin/view/Atlas/CernVMFS

Software installation requirements (non-cvmfs sites):
* an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
|-
|biomed
|
|
|
|100
|For sites providing an SE, the minimal required storage space is 1TB.
|-
|calice
|
|
|
|
|CVMFS is used for the software distribution via:
:/cvmfs/calice.desy.de
For setup instructions refer to:
: http://grid.desy.de/cvmfs
|-
|cernatschool.org
|
|
|
|0
|
|-
|cms
|
|
|4320
|20000
|Note: CMS usually sends 8-core pilots; the values for 'Multi Core' refer to that. Single-core pilots are discouraged.

Jobs require an address space larger than the memory size specified.

National VOMS groups:
In CMS, national VOMS groups, e.g. /cms/becms or /cms/dcms, are used. Those proxies must be "supported" at all sites in the following way:
* they should be treated like /cms (the base group), in case no special treatment is wanted by the site
* proxies with such national groups must be able to write to /store/user/temp (the PFN associated to this LFN)
|-
|comet.j-parc.jp
|2048
|1440
|2880
|40960
|
|-
|dteam
|
|
|
|0
|
|-
|dune
|0
|2880
|2880
|10000
|
|-
|enmr.eu
|8000
|2880
|4320
|1000
|1) For COVID-19 related jobs, slots with 8 GB/Core are required.

# The WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.

# The line:
"/enmr.eu/*"::::
has to be added to the group.conf file before configuring the grid services via yaim.
of the file /etc/grid-security/groupmapfile.
It is required to enable whatever VO group is added, for implementing per-application accounting.
|-
|epic.vo.gridpp.ac.uk
|
|
|
|0
|
|-
|esr
|
|
|
|
|No permanent storage needed, but transient and durable.
Low-latency scheduling for short jobs needed.
|-
|fermilab
|
|
|
|0
|
|-
|geant4
|
|
|
|
|CernVM-FS needs to be accessed on the WN. The CernVM-FS cache area needed is about 5GB.
|-
|gridpp
|
|
|
|0
|
|-
|hyperk.org
|0
|1440
|1440
|10000
|
|-
|icecube
|
|
|
|
|/cvmfs/icecube.opensciencegrid.org
|-
|ilc
|
|
|
|
|CVMFS is used for the software distribution via:
:/cvmfs/ilc.desy.de
For setup instructions refer to:
: http://grid.desy.de/cvmfs
|-
|ipv6.hepix.org
|
|
|
|0
|
|-
|lhcb
|0
|0
|0
|0
|The amount of space in the field "Max size of scratch space used by jobs" shall be interpreted as 50 % each for downloaded input files and produced output files.

Sites should have the Centos7 or "Cern Centos7" operating system, or later versions, installed on their worker nodes. CPUs should support the x86_64_v2 instruction set (or later). Sites are requested to provide support for singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points
:      "/cvmfs/lhcb.cern.ch/",
:      "/cvmfs/lhcb-condb.cern.ch/",
:      "/cvmfs/lhcbdev.cern.ch/",
:      "/cvmfs/unpacked.cern.ch/",
:      "/cvmfs/cernvm-prod.cern.ch/",
on the worker nodes.

Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non T1 sites providing CVMFS, direct HTCondorCE, ARC, or CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. a data reprocessing campaign).

Sites with disk storage must provide:
:- an xroot endpoint (single DNS entry), at least for reading
:- an HTTPS endpoint (single DNS entry), both read and write, supporting Third Party Copy
:- a way to do the accounting (preferably following the WLCG TF standard: https://twiki.cern.ch/twiki/bin/view/LCG/StorageSpaceAccounting)

Sites with tape storage should be accessible from the other Tier1 and Tier2 sites. They should provide one of the supported WLCG tape systems (dCache or CTA). Tape classes to optimize data distribution are to be discussed on a per-site basis.
|-
|lsst
|
|
|0
|0
|VO name must be "lsst" as it is an existing VO in OSG!
cf VOMS URL
|-
|lz
|
|
|
|0
|
|-
|magic
|
|
|
|0
|Fortran77 and other compilers. See details in the annex of the MoU (documentation section).
|-
|mice
|
|
|
|0
|
|-
|na62.vo.gridpp.ac.uk
|
|
|
|
|Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch
|-
|ops
|
|
|
|0
|
|-
|pheno
|
|
|
|0
|
|-
|skatelescope.eu
|0
|0
|0
|0
|
|-
|snoplus.snolab.ca
|
|
|
|
|The SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.
|-
|solidexperiment.org
|
|
|
|0
|Will need to set up CVMFS.
|-
|t2k.org
|1500
|600
|600
|1000
|t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
|-
|virgo
|0
|0
|0
|0
|
|-
|vo.complex-systems.eu
|0
|0
|0
|0
|
|-
|vo.cta.in2p3.fr
|0
|0
|2000
|0
|
|-
|vo.landslides.mossaic.org
|0
|0
|0
|0
|
|-
|vo.magrid.ma
|0
|0
|0
|0
|
|-
|vo.moedal.org
|0
|0
|0
|0
|
|-
|vo.northgrid.ac.uk
|0
|0
|0
|0
|
|-
|vo.scotgrid.ac.uk
|0
|0
|0
|0
|
|-
|vo.southgrid.ac.uk
|0
|0
|0
|0
|
|-
|zeus
|
|
|
|
|CVMFS is used for the software distribution via:
:/cvmfs/zeus.desy.de
For setup instructions refer to:
: http://grid.desy.de/cvmfs
|}
<!-- END OF RESOURCES -->
==VO enablement  ==
+
 
 +
The VOs that are enabled at each site are listed in [https://pprc.qmul.ac.uk/~lloyd/ukmetrics/ukmetrics.php?page=vos this VO table].
 +
 
 +
 
  
The VOs that are enabled at each site are listed in a [http://pprc.qmul.ac.uk/~lloyd/gridpp/votable.html VO table].
 
                                                       
 
 
[[Category:GridPP Deployment]]
[[Category:VOMS]]

<!-- START UPDATE DATE -->{{KeyDocs|responsible=Gerard Hand|reviewdate=2024/04/23, 14:40:12|accuratedate=2024/04/23|percentage=100}}<!-- END UPDATE DATE -->

Latest revision as of 13:44, 23 April 2024

Introduction

The GridPP Project Management Board (PMB) has agreed that up to 10 % of GridPP's processing capability should be allocated for non-LHC work. VOs that access the Grid like this must become Approved VOs; policies for managing approved VOs are described here: Policies_for_GridPP_approved_VOs.

The tables below indicate VOs that the GridPP Project Management Board has approved, and the PMB encourages support for these VOs at all of its collaborating sites. Information about all European Grid Initiative (EGI), global and local VOs is given in the EGI Operations portal which is the main reference source for VO information (including VO manager, end-points, requirements etc.).


Please Note

Please do not change the vomsdir/ or vomses/ entries or the VO Resource Requirements section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!


Approved VOs

Name Area Contact
alice The ALICE Collaboration is operating a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Our aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected. Latchezar.Betev@cern.ch
Maarten.Litmaath@cern.ch
costin.grigoras@cern.ch
bes Beijing Spectrometer (BES) is a general-purpose detector located in the interaction region of the BEPC storage ring, where the electron and positron beams collide. The BES Collaboration consists of approximately 200 physicists and engineers from 27 institutions in 4 countries.
biomed This VO covers the areas related to health and life sciences. Currently, it is divided into 3 sectors: medical imaging, bioinformatics and drug discovery. The VO is openly accessible to academics, and to private company for non-commercial purposes. glatard@creatis.insa-lyon.fr
jerome.pansanel@iphc.cnrs.fr
sorina.pop@creatis.insa-lyon.fr
glatard@creatis.insa-lyon.fr
calice CAlorimeter for the LInear Collider Experiment

A high granularity calorimeter optimised for the Particle Flow measurement of multi-jets final state at the International Linear Collider running at a center-of-mass between 90 GeV and 1 TeV.

thomas.hartmann@desy.de
andreas.gellrich@desy.de
cepc The Circular Electron Positron Collider (CEPC) is a large international scientific facility proposed by the Chinese particle physics community in 2012
cernatschool.org The CERN@school VO represents the CERN@school project on the Grid. CERN@school aims to bring CERN technology into the classroom to aid with the teaching of physics and to inspire the next generation of scientists and engineers. The CERN@school VO will allow students and teachers involved with the project to harness GridPP to store and analyse data from the CERN@school detectors, the LUCID experiment and the associated GEANT4 simulations.
clas12
vo.complex-systems.eu The goal of the vo.complex-systems.eu is to promote the study of

complex systems and complex networks on the Grid infrastructure. The vo.complex-systems.eu Virtual Organization will also serve as the building layer of collaboration among international scientists focusing on the research area of Complexity Science.

romain.reuillon@iscpif.fr
comet.j-parc.jp Muon-to-electron conversion experiment at J-PARC, which will be used by international COMET collaborators for design studies and data analysis. COMET will test Beyond-the-Standard-Model physics in a way that is complementary to the experiments at the LHC. daniela.bauer@imperial.ac.uk
Yoshi.Uchida@imperial.ac.uk
simon.fayer05@imperial.ac.uk
dteam The goal of the VO is to facilitate the deployment of a stable production Grid infrastructure. To this end, members of this VO (who have to be associated with a registered site and be involved in its operation) are allowed to run tests to validate the correct configuration of their site. Site performance evaluation and/or monitoring programs may also be run under the DTEAM VO with the approval of the Site Manager, subject to the agreement of the affected sites' management. kkoum@admin.grnet.gr
alessandro.paolini@egi.eu
matthew.viljoen@egi.eu
kyrginis@admin.grnet.gr
enmr.eu Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with both user-friendly tools, platforms for data analysis and exchange, and an underlying e-infrastructure. WeNMR groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organization, which is currently the second largest VO in the area of life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology towards EGI for the users of existing infrastructures. It involves researchers from around the world and will build bridges to other areas of structural biology. Integration with SAXS, a rapidly growing and highly complementary method, is directly included in WeNMR, but links will also be established to related initiatives. WeNMR will serve all relevant INSTRUCT communities in line with the ESFRI roadmap. Marco.Verlato@pd.infn.it
a.m.j.j.bonvin@uu.nl
rosato@cerm.unifi.it
giachetti@cerm.unifi.it
verlato@infn.it
epic.vo.gridpp.ac.uk EPIC replaces an earlier EPIC project that was focused upon Veterinary Surveillance (Phase 1). This new consortium EPIC project aims to become a world leader in policy linked research and includes some of Scotland’s leading veterinary epidemiologists and scientists.

The overarching purpose for the Centre is to provide access to high quality advice and analyses on the epidemiology and control of animal diseases that are important to Scotland, and to best prepare Scotland for the next major disease incursion. Ultimately, this strategic advice to the Scottish Government will help ensure that the interests of the various stakeholders involved in disease emergency planning and response are met as effectively as possible. This all must be achieved within the context of our rapidly changing environment. For example, issues such as climate change are now influencing the livestock disease risks that Scotland faces.

thomas.doherty@glasgow.ac.uk
esr The Earth Science Research covers research in the fields of Solid Earth, Ocean, Atmosphere and their interfaces. A large variety of communities correspond to each domain, some of them covering several domains.


In the ESR Virtual Organization (ESR-VO) four domains are represented:

  1. Earth Observation
  2. Climate
  3. Hydrology
  4. Solid Earth Physics
andre.gemuend@scai.fraunhofer.de
weissenb@ccr.jussieu.fr
weissenb@ccr.jussieu.fr
weissenb@ccr.jussieu.fr
fermilab Fermilab Virtual Organization (VO) - The Fermilab VO is an "umbrella" VO that includes the Fermilab Campus Grid (FermiGrid) and Fermilab Grid Testing (ITB) infrastructures, and all Fermilab computing activities that are not big enough to have their own Virtual Organization. Broadly these include the intensity frontier program, theoretical simulations, fixed target analysis, and accelerator and beamline design as well as activities performed by the Fermilab Campus Grid administrators. garzoglio@fnal.gov
boyd@fnal.gov
geant4 Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278. Andrea.Sciaba@cern.ch
Andrea.Sciaba@cern.ch
Andrea.Dotti@cern.ch
gridpp GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN. They are building a distributed computing Grid across the UK for particle physicists. At the moment there is a working particle physics Grid across 17 UK institutions. m.doidge@lancaster.ac.uk
hyperk.org We propose the Hyper-Kamiokande (Hyper-K) detector as a next-generation underground water Cherenkov detector. It will serve as the far detector of a long-baseline neutrino oscillation experiment envisioned for the upgraded J-PARC, and as a detector capable of observing proton decays, atmospheric neutrinos, and neutrinos of astronomical origin far beyond the sensitivity of the Super-Kamiokande (Super-K) detector. The baseline design of Hyper-K builds on the highly successful Super-K, taking full advantage of a well-proven technology. C.J.Walker@qmul.ac.uk
francesca.di_lodovico@kcl.ac.uk
icecube The goal of the VO is to enable the use of Grid resources by IceCube collaboration members, mainly for simulation and reconstruction. thomas.hartmann@desy.de
andreas.gellrich@desy.de
andreas.haupt@desy.de
ilc VO for the International Linear Collider Community. thomas.hartmann@desy.de
andreas.gellrich@desy.de
Christoph.Wissing@desy.de
ipv6.hepix.org The goal of the VO is to carry out testing of IPv6 readiness, functionality and performance of the middleware, applications and tools required by the stakeholder communities, especially HEP. Other authorised activities include use of the testbed by related IPv6 activities inside EGI, the related middleware technology providers and other Infrastructures used by WLCG/HEP. david.kelsey@stfc.ac.uk
lz This VO supports the LUX-ZEPLIN (LZ) experiment, designed to search for dark matter. E.Korolkova@sheffield.ac.uk
j.dobson@ucl.ac.uk
magic MAGIC is a system of two imaging atmospheric Cherenkov telescopes (IACTs). MAGIC-I started routine operation after commissioning in 2004. Construction of MAGIC-II was completed in early 2009, and the two telescopes have been in operation ever since, with a break in 2012 for an upgrade that achieved full homogeneity. The project is funded primarily by BMBF (Germany), MPG (Germany), INFN (Italy), MICINN (Spain), and ETH Zurich (Switzerland). neissner@pic.es
contrera@gae.ucm.es
rfirpo@pic.es
vo.magrid.ma VO vo.magrid.ma is a multidisciplinary VO providing general grid services and support to the Moroccan scientific community. rahim@cnrst.ma
mice A VO to support the activities of the Muon Ionisation Cooling Experiment (MICE). Specifically it is to enable the moving of MICE data around the Grid followed by the submission of analysis to these data. This is expected to be a small VO. d.colling@imperial.ac.uk
p.hodgson@sheffield.ac.uk
daniela.bauer@imperial.ac.uk
janusz.martyniak@imperial.ac.uk
uboone MicroBooNE is a large 170-ton liquid-argon time projection chamber (LArTPC) neutrino experiment located on the Booster neutrino beamline at Fermilab.
mu3e The Mu3e experiment is a new search for the lepton-flavour violating decay of a positive muon into two positrons and one electron.
na62.vo.gridpp.ac.uk The NA62 VO (na62.vo.gridpp.ac.uk) provides grid computing and data storage resources to the NA62 collaboration. The NA62 VO is supported by the University of Cambridge, University of Glasgow, Imperial College London, University of Birmingham, University of Lancaster, University of Liverpool, University of Manchester, Oxford University and RAL (from the UK), CERN, CNAF (Italy) and UCL (Belgium). More information about the NA62 experiment can be found at http://na62.web.cern.ch/na62/. The production portal is located at http://na62.gla.ac.uk/ Dan.Protopopescu@glasgow.ac.uk
David.Britton@glasgow.ac.uk
ops The goal of the VO is to facilitate the operations of the LCG/EGI infrastructure, which includes running official monitoring, re-certification and performance evaluation tools. Additionally the VO will be used for interoperations with other grid infrastructures. eimamagi@srce.hr
alessandro.paolini@egi.eu
pheno Phenogrid is the VO for UK theorists who do not fit within one of the LHC experiments (e.g. developers of Monte Carlo generators). jeppe.andersen@durham.ac.uk
adam.j.boutcher@durham.ac.uk
paul.clark@durham.ac.uk
snoplus.snolab.ca VO for the SNO+ experiment, a multi-purpose liquid scintillator neutrino experiment based in Sudbury, Canada. Members of the snoplus virtual organisation will contribute to the European computing effort to accurately simulate the SNO+ detector response. Jeanne.wilson@kcl.ac.uk
C.J.Walker@qmul.ac.uk
m.mottram@qmul.ac.uk
solidexperiment.org Supports grid users of the SoLid experiment. daniela.bauer@imperial.ac.uk
antonin.vacheret@imperial.ac.uk
t2k.org T2K is a neutrino experiment designed to investigate how neutrinos change from one flavour to another as they travel (neutrino oscillations). An intense beam of muon neutrinos is generated at the J-PARC nuclear physics site on the East coast of Japan and directed across the country to the Super-Kamiokande neutrino detector in the mountains of western Japan. The beam is measured once before it leaves the J-PARC site, using the near detector ND280, and again at Super-K: the change in the measured intensity and composition of the beam is used to provide information on the properties of neutrinos. sophie.king@kcl.ac.uk
tomislav.vladisavljevic@stfc.ac.uk
virgo Scientific target: detection of gravitational waves. Gravitational waves are predicted by the General Theory of Relativity but have still not been directly detected, owing to their extremely weak interaction with matter. Large interferometric detectors, like Virgo, are operating with the aim of directly detecting gravitational signals from various astrophysical sources. Signals are expected to be deeply buried in detector noise, and suitable data analysis algorithms are developed to allow detection and signal parameter estimation. Many kinds of searches need large computing resources, and in some important cases we are computationally bound: the larger the available computing power, the wider the portion of source parameter space that can be explored.

VO target: to allow data management and computationally intensive data analysis

cristiano.palomba@roma1.infn.it
alberto.colla@roma1.infn.it
vo.landslides.mossaic.org A virtual organisation for landslide modellers associated with the Management of Slope Stability in Communities (MoSSaiC) project. The VO is used for running landslide modelling software such as CHASM and QUESTA. l.kreczko@bristol.ac.uk
vo.moedal.org The MoEDAL VO allows members of the MoEDAL Collaboration to perform all of the computing activities relevant for the MoEDAL experiment, making use of available resources according to the policy defined by the Collaboration. t.whyntie@qmul.ac.uk
daniel.felea@cern.ch
vo.northgrid.ac.uk Regional Virtual Organisation created to give other local disciplines access to HEP resources at the NorthGrid sites: Manchester, Lancaster, Liverpool, Sheffield. Users from these universities can apply. alessandra.forti@cern.ch
robert.frank@manchester.ac.uk
vo.scotgrid.ac.uk The VO is for academic and other users in Scotland to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access. It is also designed as a test VO to allow maintenance and operational testing of site services. garth.roy@glasgow.ac.uk
vo.southgrid.ac.uk The VO is for academic and other users in the SouthGrid region (UKI-SOUTHGRID-BHAM-HEP, UKI-SOUTHGRID-BRIS-HEP, UKI-SOUTHGRID-CAM-HEP, UKI-SOUTHGRID-OX-HEP, UKI-SOUTHGRID-RALPP, UKI-SOUTHGRID-SUSX) to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access.

pete.gronbech@physics.ox.ac.uk
zeus ZEUS is a collaboration of about 450 physicists who are running a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. The ZEUS detector is a sophisticated tool for studying the particle reactions provided by the high-energetic beams of the HERA accelerator. Thus the participating scientists are pushing forward our knowledge of the fundamental particles and forces of nature, gaining unsurpassed insight into the exciting laws of the microcosm. thomas.hartmann@desy.de
andreas.gellrich@desy.de

IRIS Partners

Name Area Contact
atlas The ATLAS VO allows members of the ATLAS collaboration to perform all the computing activities relevant to the ATLAS experiment, making use of the available resources following the policy defined by the Collaboration. Alessandro.DeSalvo@roma1.infn.it
Elisabetta.Vilucchi@lnf.infn.it
jd@bnl.gov
james.william.walder@cern.ch
cms The Compact Muon Solenoid (CMS) experiment is a large general-purpose particle physics detector built at the proton-proton Large Hadron Collider (LHC) at CERN in Switzerland. Andreas.Pfeiffer@cern.ch
stefano.belforte@cern.ch
stefano.belforte@ts.infn.it
Daniele.Bonacorsi@bo.infn.it
Christoph.Wissing@desy.de
sexton@gmail.com
lammel@fnal.gov
jose.hernandez@ciemat.es
Daniele.Bonacorsi@bo.infn.it
gutsche@fnal.gov
Andrea.Sciaba@cern.ch
vo.cta.in2p3.fr Monte Carlo simulation production and analysis for the CTA (Cherenkov Telescope Array) international consortium.

cecile.barbier@lapp.in2p3.fr
arrabito@in2p3.fr
dune DUNE is the Deep Underground Neutrino Experiment managed by the global DUNE collaboration and hosted at Fermilab. We are building a deep-underground Liquid-Argon based neutrino detector to study accelerator-based neutrino oscillations, supernova neutrinos, and nucleon decay. andrew.mcnab@cern.ch
timm@fnal.gov
lhcb The LHCb (Large Hadron Collider Beauty) experiment is mainly aimed at solving the mystery of the matter-antimatter imbalance in the Universe. andrew.mcnab@cern.ch
concezio.bozzi@cern.ch
christophe.denis.haen@cern.ch
jan.van.eldik@cern.ch
joel.closier@cern.ch
ben.couturier@cern.ch
joel.closier@cern.ch
lsst The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey telescope with a 3200-megapixel camera, designed to image faint astronomical objects, rapidly scan the sky, and observe probes of dark matter and dark energy. LSST Data Management and Simulation jobs will run on OSG and EGI. boutigny@in2p3.fr
IGoodenow@lsst.org
fabio@in2p3.fr
yangw@SLAC.stanford.edu
kherner@fnal.gov
skatelescope.eu The Square Kilometre Array (SKA) project is an international effort to build the world’s largest radio telescope, with eventually over a square kilometre (one million square metres) of collecting area. The scale of the SKA represents a huge leap forward in both engineering and research & development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavours in history, the SKA will bring together a wealth of the world’s finest scientists, engineers and policy makers to bring the project to fruition.

The skatelescope.eu VO supports this project.

alessandra.forti@cern.ch
andrew.mcnab@cern.ch
rohini.joshi@manchester.ac.uk
eucliduk.net The Euclid mission aims to understand why the expansion of the Universe is accelerating and the nature of the source responsible for this acceleration, which physicists refer to as dark energy. msh@roe.ac.uk

Approved VOs being established into GridPP infrastructure

As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs (your site cannot support these yet, but we will let you know when each VO is set up and functioning).

Name Area Contact

VOs that have been removed from the approved list

The table below comprises a history of VOs that have been removed from the approved list for various reasons.

Name Date of removal Notes
babar                      9 Oct 2013    none
camont                     7 Jun 2017    none
camont.gridpp.ac.uk        9 Oct 2013    none
cdf                        7 Jun 2017    none
cedar                      9 Oct 2013    none
dzero                      7 Jun 2017    none
fusion                     30 Jan 2017   Discussion with Rubén Vallés Pérez. VO appears defunct.
hone                       24 Nov 2015   Discussed at Ops Meeting. Defunct.
ltwo                       9 Oct 2013    none
minos.vo.gridpp.ac.uk      9 Oct 2013    none
na48                       9 Oct 2013    none
neiss                      7 Jun 2017    none
ngs.ac.uk                  9 Oct 2013    none
superbvo.org               19 Jan 2016   Discussed at Ops Meeting. Defunct.
supernemo.vo.eu-egee.org   24 Feb 2020   Now called supernemo.org
totalep                    9 Oct 2013    none
vo.londongrid.ac.uk        in progress   [GGUS] VO not used any more
vo.sixt.cern.ch            11 Nov 2015   No members, no VOMS servers, defunct

Example site-info.def entries

Examples of site-info.def entries for YAIM have been moved to a separate page: Example site-info.def entries
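
For orientation, a minimal sketch of what such YAIM entries typically look like for the gridpp VO, built from the gridpp VOMS records listed later on this page; the linked page above remains the authoritative reference, and values should always be checked against the EGI Operations Portal:

 # site-info.def (or vo.d/gridpp) -- sketch only, values from the gridpp records below
 VO_GRIDPP_VOMS_SERVERS="'vomss://voms.gridpp.ac.uk:8443/voms/gridpp?/gridpp'"
 VO_GRIDPP_VOMSES="'gridpp voms.gridpp.ac.uk 15000 /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk gridpp'"
 VO_GRIDPP_VOMS_CA_DN="'/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B'"
 VO_GRIDPP_SW_DIR=$VO_SW_DIR/gridpp    # software area; site-specific
 VO_GRIDPP_DEFAULT_SE=$SE_HOST         # default storage element; site-specific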

Please Note

Please do not change the vomsdir/ or vomses/ entries below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!
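
For reference, each .lsc file below contains two lines: the subject DN of the VOMS server's host certificate, followed by the DN of its issuing CA. Each vomses file contains a single line of five quoted fields: VO nickname, VOMS server host, port, server DN, and VO name. As a minimal sketch, the ALICE records below could be installed by hand as follows (sites normally deploy them via RPMs or configuration management instead):

 mkdir -p /etc/grid-security/vomsdir/alice /etc/vomses
 # .lsc file: host DN on the first line, issuing CA DN on the second
 cat > /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc <<'EOF'
 /DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
 /DC=ch/DC=cern/CN=CERN Grid Certification Authority
 EOF
 # vomses file: nickname, host, port, server DN, VO name
 echo '"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"' \
   > /etc/vomses/alice-lcg-voms2.cern.ch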


Virtual Organisation: ALICE

Filename: /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms-alice-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/alice-lcg-voms2.cern.ch

"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms2.cern.ch

"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms-alice-auth.app.cern.ch

"alice" "voms-alice-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=alice-auth.web.cern.ch" "alice"

Notes: n/a
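
With the records above in place, a user with a valid grid certificate can obtain a VOMS proxy; a minimal usage sketch (voms-proxy-init picks up the server definitions from /etc/vomses or ~/.glite/vomses):

 voms-proxy-init --voms alice    # request a proxy with alice VO attributes
 voms-proxy-info --all           # inspect the proxy and its attribute certificate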


Virtual Organisation: ATLAS

Filename: /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/atlas-lcg-voms2.cern.ch

"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms2.cern.ch

"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms-atlas-auth.app.cern.ch

"atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"

Notes: n/a


Virtual Organisation: BES

Filename: /etc/grid-security/vomsdir/bes/voms.ihep.ac.cn.lsc

/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority

Filename: /etc/vomses/bes-voms.ihep.ac.cn

"bes" "voms.ihep.ac.cn" "15001" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "bes"

Notes: n/a


Virtual Organisation: BIOMED

Filename: /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc

/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/biomed-cclcgvomsli01.in2p3.fr

"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "biomed"

Notes: n/a


Virtual Organisation: CALICE

Filename: /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/calice-grid-voms.desy.de

"calice" "grid-voms.desy.de" "15102" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "calice"

Notes: n/a


Virtual Organisation: CEPC

Filename: /etc/grid-security/vomsdir/cepc/voms.ihep.ac.cn.lsc

/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn
/C=CN/O=HEP/CN=Institute of High Energy Physics Certification Authority

Filename: /etc/vomses/cepc-voms.ihep.ac.cn

"cepc" "voms.ihep.ac.cn" "15005" "/C=CN/O=HEP/OU=CC/O=IHEP/CN=voms.ihep.ac.cn" "cepc"

Notes: n/a


Virtual Organisation: CERNATSCHOOL.ORG

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/cernatschool.org-voms.gridpp.ac.uk

"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk

"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk

"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"

Notes: n/a


Virtual Organisation: CLAS12

Notes: n/a


Virtual Organisation: CMS

Filename: /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/cms-lcg-voms2.cern.ch

"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms2.cern.ch

"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms-cms-auth.app.cern.ch

"cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"

Notes: n/a


Virtual Organisation: COMET.J-PARC.JP

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/comet.j-parc.jp-voms.gridpp.ac.uk

"comet.j-parc.jp" "voms.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "comet.j-parc.jp"

Filename: /etc/vomses/comet.j-parc.jp-voms02.gridpp.ac.uk

"comet.j-parc.jp" "voms02.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "comet.j-parc.jp"

Filename: /etc/vomses/comet.j-parc.jp-voms03.gridpp.ac.uk

"comet.j-parc.jp" "voms03.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "comet.j-parc.jp"

Notes: n/a


Virtual Organisation: DTEAM

Filename: /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/dteam-voms2.hellasgrid.gr

"dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"

Notes: n/a


Virtual Organisation: DUNE

Filename: /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/vomses/dune-voms1.fnal.gov

"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "dune"

Filename: /etc/vomses/dune-voms2.fnal.gov

"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "dune"

Notes: n/a


Virtual Organisation: ENMR.EU

Filename: /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/enmr.eu-voms2.cnaf.infn.it

"enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "enmr.eu"

Notes: n/a


Virtual Organisation: EPIC.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: ESR

Filename: /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc

/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl
/C=NL/O=NIKHEF/CN=NIKHEF medium-security certification auth

Filename: /etc/vomses/esr-voms.grid.sara.nl

"esr" "voms.grid.sara.nl" "30001" "/O=dutchgrid/O=hosts/OU=sara.nl/CN=voms.grid.sara.nl" "esr"

Notes: n/a


Virtual Organisation: EUCLIDUK.NET

Filename: /etc/grid-security/vomsdir/eucliduk.net/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/eucliduk.net-voms.gridpp.ac.uk

"eucliduk.net" "voms.gridpp.ac.uk" "15518" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "eucliduk.net"

Notes: n/a


Virtual Organisation: FERMILAB

Filename: /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/vomses/fermilab-voms1.fnal.gov

"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms1.fnal.gov" "fermilab"

Filename: /etc/vomses/fermilab-voms2.fnal.gov

"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/CN=voms2.fnal.gov" "fermilab"

Notes: n/a


Virtual Organisation: GEANT4

Filename: /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/geant4-lcg-voms2.cern.ch

"geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"

Filename: /etc/vomses/geant4-voms2.cern.ch

"geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"

Notes: n/a


Virtual Organisation: GRIDPP

Filename: /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/gridpp-voms.gridpp.ac.uk

"gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms02.gridpp.ac.uk

"gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms03.gridpp.ac.uk

"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"

Notes: n/a


Virtual Organisation: HYPERK.ORG

Filename: /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/hyperk.org-voms.gridpp.ac.uk

"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms02.gridpp.ac.uk

"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms03.gridpp.ac.uk

"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"

Notes: n/a


Virtual Organisation: ICECUBE

Filename: /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/icecube-grid-voms.desy.de

"icecube" "grid-voms.desy.de" "15106" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "icecube"

Notes: n/a


Virtual Organisation: ILC

Filename: /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/ilc-grid-voms.desy.de

"ilc" "grid-voms.desy.de" "15110" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "ilc"

Notes: n/a


Virtual Organisation: IPV6.HEPIX.ORG

Filename: /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it

"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"

Notes: n/a


Virtual Organisation: LHCB

Filename: /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/lhcb-lcg-voms2.cern.ch

"lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"

Filename: /etc/vomses/lhcb-voms2.cern.ch

"lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"

Notes: n/a


Virtual Organisation: LSST

Filename: /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc

/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu
/C=US/O=Internet2/CN=InCommon RSA IGTF Server CA 3

Filename: /etc/vomses/lsst-voms.slac.stanford.edu

"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/O=Stanford University/CN=voms.slac.stanford.edu" "lsst"

Notes: n/a


Virtual Organisation: LZ

Filename: /etc/grid-security/vomsdir/lz/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/lz/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/lz-voms.gridpp.ac.uk

"lz" "voms.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "lz"

Filename: /etc/vomses/lz-voms02.gridpp.ac.uk

"lz" "voms02.gridpp.ac.uk" "15517" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "lz"

Notes: n/a


Virtual Organisation: MAGIC

Notes: n/a


Virtual Organisation: MICE

Filename: /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mice-voms.gridpp.ac.uk

"mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms02.gridpp.ac.uk

"mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms03.gridpp.ac.uk

"mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"

Notes: n/a


Virtual Organisation: MU3E

Filename: /etc/grid-security/vomsdir/mu3e/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mu3e-voms.gridpp.ac.uk

"mu3e" "voms.gridpp.ac.uk" "15516" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mu3e"

Notes: n/a


Virtual Organisation: NA62.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: OPS

Filename: /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/ops-lcg-voms2.cern.ch

"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"

Filename: /etc/vomses/ops-voms2.cern.ch

"ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"

Notes: n/a


Virtual Organisation: PHENO

Filename: /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/pheno-voms.gridpp.ac.uk

"pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms02.gridpp.ac.uk

"pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms03.gridpp.ac.uk

"pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"

Notes: n/a


Virtual Organisation: SKATELESCOPE.EU

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk

"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk

"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk

"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"

Notes: n/a


Virtual Organisation: SNOPLUS.SNOLAB.CA

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk

"snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk

"snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk

"snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"

Notes: n/a


Virtual Organisation: SOLIDEXPERIMENT.ORG

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk

"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk

"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk

"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"

Notes: n/a


Virtual Organisation: T2K.ORG

Filename: /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/t2k.org-voms.gridpp.ac.uk

"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms02.gridpp.ac.uk

"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms03.gridpp.ac.uk

"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"

Notes: n/a


Virtual Organisation: UBOONE

Notes: n/a


Virtual Organisation: VIRGO

Filename: /etc/grid-security/vomsdir/virgo/voms.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/virgo-voms.cnaf.infn.it

"virgo" "voms.cnaf.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms.cnaf.infn.it" "virgo"

Notes: n/a


Virtual Organisation: VO.COMPLEX-SYSTEMS.EU

Filename: /etc/grid-security/vomsdir/vo.complex-systems.eu/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/vo.complex-systems.eu-voms2.hellasgrid.gr

"vo.complex-systems.eu" "voms2.hellasgrid.gr" "15160" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "vo.complex-systems.eu"

Notes: n/a


Virtual Organisation: VO.CTA.IN2P3.FR

Filename: /etc/grid-security/vomsdir/vo.cta.in2p3.fr/cclcgvomsli01.in2p3.fr.lsc

/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/vo.cta.in2p3.fr-cclcgvomsli01.in2p3.fr

"vo.cta.in2p3.fr" "cclcgvomsli01.in2p3.fr" "15008" "/DC=org/DC=terena/DC=tcs/C=FR/ST=Paris/O=Centre national de la recherche scientifique/CN=cclcgvomsli01.in2p3.fr" "vo.cta.in2p3.fr"

Notes: n/a


Virtual Organisation: VO.LANDSLIDES.MOSSAIC.ORG

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"

Notes: n/a


Virtual Organisation: VO.MAGRID.MA

Filename: /etc/grid-security/vomsdir/vo.magrid.ma/voms.magrid.ma.lsc

/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma
/C=MA/O=MaGrid/CN=MaGrid CA

Filename: /etc/vomses/vo.magrid.ma-voms.magrid.ma

"vo.magrid.ma" "voms.magrid.ma" "15001" "/C=MA/O=MaGrid/OU=CNRST/CN=voms.magrid.ma" "vo.magrid.ma"

Notes: n/a


Virtual Organisation: VO.MOEDAL.ORG

Filename: /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch

"vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"

Filename: /etc/vomses/vo.moedal.org-voms2.cern.ch

"vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"

Notes: n/a


Virtual Organisation: VO.NORTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SCOTGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SOUTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"

Notes: n/a


Virtual Organisation: ZEUS

Filename: /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc

/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/zeus-grid-voms.desy.de

"zeus" "grid-voms.desy.de" "15112" "/DC=org/DC=terena/DC=tcs/C=DE/ST=Hamburg/O=Deutsches Elektronen-Synchrotron DESY/CN=grid-voms.desy.de" "zeus"

Notes: n/a



Not Listed in the EGI Operations Portal

Virtual Organisation: PLANCK

Filename: /etc/grid-security/vomsdir/planck/voms.cnaf.infn.it.lsc

/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it
/C=IT/O=INFN/CN=INFN Certification Authority

Filename: /etc/vomses/planck-voms.cnaf.infn.it

"planck" "voms.cnaf.infn.it" "15002" "/C=IT/O=INFN/OU=Host/L=CNAF/CN=voms.cnaf.infn.it" "planck"


Notes: n/a
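
As a quick sanity check, a site can verify that every vomses entry installed above has a matching .lsc file; a small sketch, assuming the standard paths used on this page:

 # For each vomses file, extract the VO nickname and server host from the
 # quoted fields and check that the corresponding .lsc file is readable.
 for f in /etc/vomses/*; do
   vo=$(awk -F'"' 'NR==1 {print $2}' "$f")
   host=$(awk -F'"' 'NR==1 {print $4}' "$f")
   lsc="/etc/grid-security/vomsdir/$vo/$host.lsc"
   [ -r "$lsc" ] || echo "missing $lsc (referenced by $f)"
 done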

VO Resource Requirements

Please Note

Please do not change the table below as it is automatically updated from the EGI Operations Portal. Any changes you make will be lost.


VO RAM/Core (MB) MaxCPU (min) MaxWall (min) Scratch (MB) Other
alice 2000 1320 1500 10000
atlas 2048 5760 5760 20000 Additional runtime requirements:
  • at least 4GB of VM for each job slot

Software installation common items:

  • the full compiler suite (C/C++ and Fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages on SL4-like nodes;
  • the recommended version of the compilers is 3.4.6;
  • the f2c and libgfortran libraries (in both i386 and x86_64 versions, in case of x86_64 systems) are also required to run the software;
  • other libraries required are:
libpopt.so.0
libblas.so

Software installation setup (cvmfs sites):

Software installation requirements (non-cvmfs sites):

  • an experimental software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
biomed 100 1 1 100 For sites providing an SE, minimal required storage space is 1TB.
calice 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/calice.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
cernatschool.org 0 0 0 0
cms 2000 2880 4320 20000 Note: CMS usually sends 8-core pilots, values for 'Multi Core' refer to that. Single-core pilots are discouraged.

Jobs require an address space larger than the memory size specified above. Sites should allow processes at least 6 GB more virtual address space per core than memory, to accommodate the large number of shared libraries used by jobs. (For a typical 8-core pilot that would translate into a VZSIZE limit of at least 64 GB.)

Cloud resources should provision 8-core VMs to match standard 8-core pilots.

Input I/O requirement is an average 2.5 MB/s per thread from MSS.

All jobs need to have outbound connectivity.

Sites must not use pool accounts for the FQAN cms:/cms/Role=lcgadmin. For any other CMS job, sites need to use pool accounts so that at any time every grid credential is mapped to an independent local account.


National VOMS groups: CMS uses national VOMS groups, e.g. /cms/becms or /cms/dcms. Those proxies must be "supported" at all sites in the following way:

  • should be treated like /cms (base group), in case no special treatment is wanted by the site
  • proxies with such national groups must be able to write to /store/user/temp (the PFN associated to this LFN)
comet.j-parc.jp 2048 1440 2880 40960
dteam 0 0 0 0
dune 0 2880 2880 10000
enmr.eu 8000 2880 4320 1000 1. For COVID-19 related jobs, slots with 8 GB/core are required.
  2. The WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.
  3. The line "/enmr.eu/*":::: has to be added to the group.conf file before configuring the grid services via YAIM. In the CREAM-CE this results in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" .enmr and "/enmr.eu/*" .enmr in both /etc/grid-security/grid-mapfile and /etc/grid-security/voms-grid-mapfile, and in the lines "/enmr.eu/*/Role=NULL/Capability=NULL" enmr and "/enmr.eu/*" enmr in /etc/grid-security/groupmapfile. Enabling every VO group added this way is required to implement per-application accounting.

epic.vo.gridpp.ac.uk 0 0 0 0
esr 2048 2100 0 0 Many applications only need part of the following. Java/Perl/Python/C/C++/FORTRAN77,-90,-95; IDL and MATLAB runtime; Scilab or Octave. Needs MPI for some applications.

Some applications require access to job output during execution, some even interaction via X11. 1 GB RAM; some applications need 3 GB RAM. Outbound connectivity from WN to databases. Shared file system needed for MPI applications, with about 10 GB of space. There are applications needing about 1000 simultaneously open files. Depending on the application, output file sizes range from a few MB to 5 GB, for a total of several hundred thousand files. No permanent storage needed, but transient and durable storage are. Low-latency scheduling for short jobs needed.

fermilab 0 0 0 0
geant4 1000 650 850 300 Software is distributed via CernVM-FS

(http://cernvm.cern.ch/portal/filesystem); the configuration should include the geant4.cern.ch and dependency (sft.cern.ch, grid.cern.ch) areas.

CernVM-FS needs to be accessed on WN. CernVM-FS Cache area needed is about 5GB.

gridpp 1000 1000 0 0
hyperk.org 0 1440 1440 10000
icecube 4000 2880 2880 40000 CVMFS is used for the software distribution via:

/cvmfs/icecube.opensciencegrid.org

ilc 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/ilc.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
ipv6.hepix.org 0 0 0 0
lhcb 0 0 0 20000 Further recommendations from LHCb for sites:

The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of an LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of the memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB, which needs to be added as a requirement to the field "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.

The amount of space in the field "Max size of scratch space used by jobs" shall be interpreted as 50% each for downloaded input files and produced output files.

Sites should have the Centos7 or "Cern Centos7" operating system, or later versions, installed on their worker nodes. CPUs should support the x86_64_v2 instruction set (or later). Sites are requested to provide support for singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points

"/cvmfs/lhcb.cern.ch/",
"/cvmfs/lhcb-condb.cern.ch/",
"/cvmfs/lhcbdev.cern.ch/",
"/cvmfs/unpacked.cern.ch/",
"/cvmfs/cernvm-prod.cern.ch/",
on the worker nodes. 

Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non T1 sites providing CVMFS, direct HTCondorCE, ARC, or CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaign).

Sites with disk storage must provide:

- an xroot endpoint (single DNS entry), at least for reading
- an HTTPS endpoint (single DNS entry), both read and write, supporting Third Party Copy
- a way to do the accounting (preferably following the WLCG TF standard: https://twiki.cern.ch/twiki/bin/view/LCG/StorageSpaceAccounting)

Sites with tape storage should be accessible from the other Tier1 and Tier2 sites. They should provide one of the supported WLCG tape systems (dCache or CTA). Tape classes to optimize data distribution are to be discussed on a per-site basis.

lsst 0 0 0 0 VO name must be "lsst" as it is an existing VO in OSG!

cf VOMS URL

lz 0 0 0 0
magic 1024 5000 0 0 Fortran77 and other compilers. See details in annex of MoU (documentation section).
mice 0 0 0 0
na62.vo.gridpp.ac.uk 2048 500 720 2048 VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch

Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch

ops 0 0 0 0
pheno 0 0 0 0
skatelescope.eu 0 0 0 0
snoplus.snolab.ca 2000 1440 2160 20000 g++

gcc python-devel uuid-devel zlib-devel

SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.

solidexperiment.org 0 0 0 0 Will need to set up CVMFS.
t2k.org 1500 600 600 1000 t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
virgo 0 0 0 0
vo.complex-systems.eu 0 0 0 0
vo.cta.in2p3.fr 0 0 2000 0
vo.landslides.mossaic.org 0 0 0 0
vo.magrid.ma 0 0 0 0
vo.moedal.org 0 0 0 0
vo.northgrid.ac.uk 0 0 0 0
vo.scotgrid.ac.uk 0 0 0 0
vo.southgrid.ac.uk 0 0 0 0
zeus 2048 3600 5400 5000 CVMFS is used for the software distribution via:
/cvmfs/zeus.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
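
Many of the VOs above distribute software via CVMFS (e.g. calice.desy.de, icecube.opensciencegrid.org, ilc.desy.de, na62.cern.ch, wenmr.egi.eu, zeus.desy.de and the lhcb.cern.ch mount points). A minimal client-side sketch follows; the repository list should match the VOs a site actually supports, and the squid host is a hypothetical placeholder:

 # /etc/cvmfs/default.local -- sketch only
 CVMFS_REPOSITORIES=na62.cern.ch,geant4.cern.ch,sft.cern.ch,calice.desy.de
 CVMFS_HTTP_PROXY="http://squid.example.ac.uk:3128"   # hypothetical site squid

 # apply and test the configuration
 cvmfs_config setup
 cvmfs_config probe na62.cern.ch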


VO Activity

The VOs that are enabled at each site are listed in this VO table.

This page is a Key Document, and is the responsibility of Gerard Hand. It was last reviewed on 2024/04/23, 14:40:12 when it was considered to be 100% complete. It was last judged to be accurate on 2024/04/23.