GridPP approved VOs new

From GridPP Wiki
 
!Area
!Contact
<!-- start egi approved list -->
|-
|[https://alice-collaboration.web.cern.ch/ alice]
|The ALICE Collaboration is operating a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Our aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected.
|[mailto:Latchezar.Betev@cern.ch Latchezar Betev]<br>[mailto:Maarten.Litmaath@cern.ch Maarten Litmaath]<br>[mailto:costin.grigoras@cern.ch Costin Grigoras]<br>[mailto:w.rabit@thehole.com White Rabit]
|-
|[https://www.racf.bnl.gov/docs/howto/grid/joinvo atlas]
|The ATLAS VO allows the members of the ATLAS collaboration to perform all the computing activities relevant for the ATLAS experiment, making use of the available resources following the policy defined by the Collaboration.
|[mailto:Alessandro.DeSalvo@roma1.infn.it Alessandro De Salvo]<br>[mailto:Simone.Campana@cern.ch Simone Campana]<br>[mailto:Alessandro.Di.Girolamo@cern.ch Alessandro Di Girolamo]<br>[mailto:jhover@bnl.gov John Hover]<br>[mailto:Luca.Vaccarossa@mi.infn.it Luca Vaccarossa]<br>[mailto:Elisabetta.Vilucchi@lnf.infn.it Elisabetta Vilucchi]<br>[mailto:jd@bnl.gov John De Stefano Jr]<br>[mailto:james@somewhere.com James Walder]
|-
|[http://lsgc.org/biomed.html biomed]
|This VO covers the areas related to health and life sciences. Currently, it is divided into 3 sectors: medical imaging, bioinformatics and drug discovery. The VO is openly accessible to academics, and to private companies for non-commercial purposes.
|[mailto:glatard@creatis.insa-lyon.fr Tristan Glatard]<br>[mailto:jerome.pansanel@iphc.cnrs.fr Jerome Pansanel]<br>[mailto:fmichel@i3s.unice.fr Franck Michel]<br>[mailto:sorina.pop@creatis.insa-lyon.fr Sorina Camarasu]<br>[mailto:glatard@creatis.insa-lyon.fr Tristan Glatard]
|-
|[http://cms.cern.ch/iCMS/ cms]
|The Compact Muon Solenoid (CMS) experiment is a large general-purpose particle physics detector built at the proton-proton Large Hadron Collider (LHC) at CERN in Switzerland.
|[mailto:Andreas.Pfeiffer@cern.ch Andreas Pfeiffer]<br>[mailto:stefano.belforte@cern.ch Stefano Belforte]<br>[mailto:stefano.belforte@ts.infn.it Stefano Belforte]<br>[mailto:Daniele.Bonacorsi@bo.infn.it Bonacorsi Daniele]<br>[mailto:Christoph.Wissing@desy.de Christoph Wissing]<br>[mailto:sexton@gmail.com Elizabeth Sexton-Kennedy]<br>[mailto:lammel@fnal.gov Stephan Lammel]<br>[mailto:jose.hernandez@ciemat.es Jose Hernandez]<br>[mailto:Daniele.Bonacorsi@bo.infn.it Bonacorsi Daniele]<br>[mailto:Andrea.Sciaba@cern.ch Andrea Sciaba]<br>[mailto:gutsche@fnal.gov Oliver Gutsche]
|-
|[http://wiki.egi.eu/wiki/Dteam_vo dteam]
|The goal of the VO is to facilitate the deployment of a stable production Grid infrastructure. To this end, members of this VO (who have to be associated with a registered site and be involved in its operation) are allowed to run tests to validate the correct configuration of their site. Site performance evaluation and/or monitoring programs may also be run under the DTEAM VO with the approval of the Site Manager, subject to the agreement of the affected sites' management.
|[mailto:kkoum@admin.grnet.gr Kostas Koumantaros]<br>[mailto:alessandro.paolini@egi.eu Alessandro Paolini]<br>[mailto:matthew.viljoen@egi.eu Matthew Viljoen]<br>[mailto:kyrginis@admin.grnet.gr Kyriakos Gkinis]<br>[mailto:alessandro.paolini@egi.eu Alessandro Paolini]
|-
|[http://www.euearthsciencegrid.org/content/esr-vo-introduction esr]
   4. Solid Earth Physics
|[mailto:andre.gemuend@scai.fraunhofer.de Andre Gemuend]<br>[mailto:weissenb@ccr.jussieu.fr David Weissenbach]<br>[mailto:weissenb@ccr.jussieu.fr David Weissenbach]<br>[mailto:weissenb@ccr.jussieu.fr David Weissenbach]
|-
|[http://geant4.web.cern.ch/geant4/ geant4]
|Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278.
|[mailto:Andrea.Sciaba@cern.ch Andrea Sciaba]<br>[mailto:Andrea.Sciaba@cern.ch Andrea Sciaba]<br>[mailto:Andrea.Dotti@cern.ch Andrea Dotti]
|-
|[http://lhcb.web.cern.ch/lhcb/ lhcb]
|The LHCb (Large Hadron Collider Beauty) experiment is mainly set on finding the solution to the mystery of the matter-antimatter imbalance in the Universe.
|[mailto:joel.closier@cern.ch Joel Closier]<br>[mailto:andrew.mcnab@cern.ch Andrew McNab]<br>[mailto:concezio.bozzi@cern.ch Concezio Bozzi]<br>[mailto:ben.couturier@cern.ch Ben Couturier]<br>[mailto:joel.closier@cern.ch Joel Closier]
|-
|[http://magic.mppmu.mpg.de magic]
|MAGIC is a system of two imaging atmospheric Cherenkov telescopes (or IACTs). MAGIC-I started routine operation after commissioning in 2004. Construction of MAGIC-II was completed in early 2009, and the two telescopes have been in operation ever since, with a break in 2012 for an upgrade that achieved full homogeneity. The project is funded primarily by the funding agencies BMBF (Germany), MPG (Germany), INFN (Italy), MICINN (Spain), and the ETH Zurich (Switzerland).
|[mailto:neissner@pic.es Neissner Christian]<br>[mailto:contrera@gae.ucm.es Jose Luis Contreras]<br>[mailto:rfirpo@pic.es Roger Firpo]
|-
|[https://wiki.egi.eu/wiki/OPS_vo ops]
|The goal of the VO is to facilitate the operations of the LCG/EGI infrastructure, which includes running official monitoring, re-certification and performance evaluation tools. Additionally the VO will be used for interoperations with other grid infrastructures.
|[mailto:eimamagi@srce.hr Emir Imamagic]<br>[mailto:alessandro.paolini@egi.eu Alessandro Paolini]<br>[mailto:vincenzo.spinoso@egi.eu Vincenzo Spinoso]
|-
|[http://www.t2k.org t2k.org]
|T2K is a neutrino experiment designed to investigate how neutrinos change from one flavour to another as they travel (neutrino oscillations). An intense beam of muon neutrinos is generated at the J-PARC nuclear physics site on the East coast of Japan and directed across the country to the Super-Kamiokande neutrino detector in the mountains of western Japan. The beam is measured once before it leaves the J-PARC site, using the near detector ND280, and again at Super-K: the change in the measured intensity and composition of the beam is used to provide information on the properties of neutrinos.
|[mailto:lukas.koch@uni-mainz.de Lukas Koch]<br>[mailto:soph.e.king123@gmail.com Sophie King]<br>[mailto:tomislav.vladisavljevic@stfc.ac.uk Tomislav Vladisavljevic]
|-
|[http://moedal.org vo.moedal.org]
|The MoEDAL VO allows members of the MoEDAL Collaboration to perform all of the computing activities relevant for the MoEDAL experiment, making use of available resources according to the policy defined by the Collaboration.
|[mailto:t.whyntie@qmul.ac.uk Tom Whyntie]<br>[mailto:daniel.felea@cern.ch Daniel Felea]
<!-- end egi approved list -->
|}
  
 
!Area
!Contact
<!-- start global approved list -->
|-
|[https://twiki.cern.ch/twiki/bin/view/CALICE/ calice]
A high granularity calorimeter optimised for the Particle Flow measurement of multi-jet final states at the International Linear Collider, running at a center-of-mass energy between 90 GeV and 1 TeV.
|[mailto:poeschl@lal.in2p3.fr Roman Poeschl]<br>[mailto:Shaojun.Lu@desy.de Shaojun Lu]<br>[mailto:Andreas.Gellrich@desy.de Andreas Gellrich]
|-
|[http://www.icecube.wisc.edu/ icecube]
|The goal of the VO is to enable the usage of Grid resources for ICECUBE collaboration members, mainly for simulation and reconstruction.
|[mailto:dpieloth@physik.uni-dortmund.de Damian Pieloth]
|-
|[http://www-flc.desy.de/flc/ ilc]
|VO for the International Linear Collider Community.
|[mailto:Frank.Gaede@desy.de Frank Gaede]<br>[mailto:andreas.gellrich@desy.de Andreas Gellrich]<br>[mailto:Christoph.Wissing@desy.de Christoph Wissing]
|-
|[https://voms2.cnaf.infn.it:8443/voms/ipv6.hepix.org/admin/home.action ipv6.hepix.org]
|The goal of the VO is to carry out testing of IPv6 readiness, functionality and performance of the middleware, applications and tools required by the stakeholder communities, especially HEP. Other authorised activities include use of the testbed by related IPv6 activities inside EGI, the related middleware technology providers and other Infrastructures used by WLCG/HEP.
|[mailto:david.kelsey@stfc.ac.uk David Kelsey]
|-
|[http://www.lsst.org/lsst/ lsst]
|The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey telescope with a 3200-megapixel camera, designed to image faint astronomical objects, rapidly scan the sky and observe probes of dark matter and dark energy. LSST Data Management and Simulation jobs will run on OSG and EGI.
|[mailto:arnault@lal.in2p3.fr Christian Arnault]<br>[mailto:boutigny@in2p3.fr Dominique Boutigny]<br>[mailto:garzoglio@fnal.gov Gabriele Garzoglio]<br>[mailto:IGoodenow@lsst.org Iain Goodenow]<br>[mailto:fabio@in2p3.fr Fabio Hernandez]
|-
|[https://na62.gla.ac.uk/ na62.vo.gridpp.ac.uk]
|The NA62 VO (na62.vo.gridpp.ac.uk) is meant to provide grid computing and data storage resources to the NA62 collaboration. The NA62 VO is supported by the University of Cambridge, University of Glasgow, Imperial College London, University of Birmingham, University of Lancaster, University of Liverpool, University of Manchester, Oxford University and RAL (from the UK), CERN, CNAF (Italy) and UCL (Belgium). More information about the NA62 experiment can be found at http://na62.web.cern.ch/na62/. The production portal is located at http://na62.gla.ac.uk/
|[mailto:Dan.Protopopescu@glasgow.ac.uk Dan Protopopescu]<br>[mailto:David.Britton@glasgow.ac.uk David Britton]
|-
|[https://www.skatelescope.org/the-ska-project/ skatelescope.eu]
The VO skatelescope.eu is the VO supporting this project.
|[mailto:alessandra.forti@cern.ch Alessandra Forti]<br>[mailto:andrew.mcnab@cern.ch Andrew McNab]<br>[mailto:rohini.joshi@manchester.ac.uk Rohini Joshi]
|-
|[http://www-zeus.desy.de/ zeus]
|ZEUS is a collaboration of about 450 physicists who are running a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. The ZEUS detector is a sophisticated tool for studying the particle reactions provided by the high-energy beams of the HERA accelerator. Thus the participating scientists are pushing forward our knowledge of the fundamental particles and forces of nature, gaining unsurpassed insight into the exciting laws of the microcosm.
|[mailto:Andreas.Gellrich@desy.de Andreas Gellrich]
<!-- end global approved list -->
|}
 
!Area
!Contact
<!-- start local approved list -->
|-
|[http://researchinschools.org/CERN/ cernatschool.org]
|The CERN@school VO represents the CERN@school project on the Grid. CERN@school aims to bring CERN technology into the classroom to aid with the teaching of physics and to inspire the next generation of scientists and engineers. The CERN@school VO will allow students and teachers involved with the project to harness GridPP to store and analyse data from the CERN@school detectors, the LUCID experiment and the associated GEANT4 simulations.
|
|-
|[http://www.sruc.ac.uk/epic/ epic.vo.gridpp.ac.uk]
The overarching purpose for the Centre is to provide access to high quality advice and analyses on the epidemiology and control of animal diseases that are important to Scotland, and to best prepare Scotland for the next major disease incursion. Ultimately, this strategic advice to the Scottish Government will help ensure that the interests of the various stakeholders involved in disease emergency planning and response are met as effectively as possible. This all must be achieved within the context of our rapidly changing environment. For example, issues such as climate change are now influencing the livestock disease risks that Scotland faces.
|[mailto:thomas.doherty@glasgow.ac.uk Thomas Doherty]
|-
|[http://www.gridpp.ac.uk gridpp]
|GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN. They are building a distributed computing Grid across the UK for particle physicists. At the moment there is a working particle physics Grid across 17 UK institutions.
|[mailto:J.Coles@rl.ac.uk Coles Jeremy]
|-
|[http://www.hyperk.org hyperk.org]
|We propose the Hyper-Kamiokande (Hyper-K) detector as a next generation underground water Cherenkov detector. It will serve as a far detector of a long baseline neutrino oscillation experiment envisioned for the upgraded J-PARC, and as a detector capable of observing -- far beyond the sensitivity of the Super-Kamiokande (Super-K) detector -- proton decays, atmospheric neutrinos, and neutrinos from astronomical origins. The baseline design of Hyper-K is based on the highly successful Super-K, taking full advantage of a well-proven technology.
|[mailto:C.J.Walker@qmul.ac.uk Christopher Walker]<br>[mailto:francesca.di_lodovico@kcl.ac.uk Francesca Di Lodovico]
|-
|[http://www.mice.iit.edu/ mice]
|A VO to support the activities of the Muon Ionisation Cooling Experiment (MICE). Specifically, it enables moving MICE data around the Grid and submitting analysis jobs against these data. This is expected to be a small VO.
|[mailto:d.colling@imperial.ac.uk David Colling]<br>[mailto:p.hodgson@sheffield.ac.uk Paul Hodgson]<br>[mailto:daniela.bauer@imperial.ac.uk Daniela Bauer]<br>[mailto:janusz.martyniak@imperial.ac.uk Janusz Martyniak]
|-
|[http://www.phenogrid.dur.ac.uk/ pheno]
|Phenogrid is the VO for UK theorists who don't fit within one of the LHC experiments (e.g. developers of Monte Carlos etc.). The rest of this text exists only to satisfy the extremely unnecessary minimum limit of 200 characters.
|[mailto:jeppe.andersen@durham.ac.uk Jeppe Andersen]<br>[mailto:adam.j.boutcher@durham.ac.uk Adam Boutcher]<br>[mailto:paul.clark@durham.ac.uk Paul Clark]
|-
|[https://snoplus.phy.queensu.ca/ snoplus.snolab.ca]
|VO for the snoplus experiment, a multi-purpose liquid scintillator neutrino experiment based in Sudbury, Canada. Members of the snoplus virtual organisation will contribute to the European computing effort to accurately simulate the SNOplus detector response.
|[mailto:Jeanne.wilson@kcl.ac.uk Jeanne Wilson]<br>[mailto:C.J.Walker@qmul.ac.uk Christopher Walker]<br>[mailto:m.mottram@qmul.ac.uk Matthew Mottram]
|-
|[http://www.imperial.ac.uk/high-energy-physics/research/experiments/solid/ solidexperiment.org]
|Supports grid users of the SoLid experiment.
|[mailto:daniela.bauer@imperial.ac.uk Daniela Bauer]<br>[mailto:antonin.vacheret@imperial.ac.uk Antonin Vacheret]
|-
|[https://voms.gridpp.ac.uk:8443/voms/vo.northgrid.ac.uk vo.northgrid.ac.uk]
|Regional Virtual Organisation created to allow access to HEP resources to other local disciplines from Northgrid sites: Manchester, Lancaster, Liverpool, Sheffield. Users from these universities can apply.
|[mailto:alessandra.forti@cern.ch Alessandra Forti]<br>[mailto:robert.frank@manchester.ac.uk Robert Frank]<br>[mailto:robert.frank@manchester.ac.uk Robert Frank]
|-
|[http://www.scotgrid.ac.uk/ vo.scotgrid.ac.uk]
|The VO is for academic and other users in Scotland to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long term access. It is also designed as a test VO to allow maintenance and operational testing of site services.
|[mailto:garth.roy@glasgow.ac.uk Gareth Roy]
|-
|[http://www.southgrid.ac.uk/VO/ vo.southgrid.ac.uk]
whether to set up one of their own for long term access.
|[mailto:gronbech@physics.ox.ac.uk Peter Gronbech]
<!-- end local approved list -->
|}
  
 
!Area
!Contact
<!-- start other approved list -->
|-
|[http://wiki.grid.auth.gr/wiki/bin/view/ComplexityScienceSSC/VO vo.complex-systems.eu]
focusing on the research area of Complexity Science.
|[mailto:romain.reuillon@iscpif.fr Romain Reuillon]
|-
|[http://comet.kek.jp comet.j-parc.jp]
|Muon-to-electron conversion experiment at J-PARC, which will be used by international COMET collaborators for design studies and data analysis. COMET will test Beyond-the-Standard-Model physics in a way that is complementary to the experiments at the LHC.
|[mailto:daniela.bauer@imperial.ac.uk Daniela Bauer]<br>[mailto:Yoshi.Uchida@imperial.ac.uk Yoshi Uchida]<br>[mailto:simon.fayer05@imperial.ac.uk Simon Fayer]
|-
|[https://portal.cta-observatory.org/Pages/Home.aspx vo.cta.in2p3.fr]
international consortium.
|[mailto:cecile.barbier@lapp.in2p3.fr Cecile Barbier]<br>[mailto:arrabito@in2p3.fr Luisa Arrabito]
|-
|[http://www.dunescience.org dune]
|DUNE is the Deep Underground Neutrino Experiment, managed by the global DUNE collaboration and hosted at Fermilab. We are building a deep-underground liquid-argon based neutrino detector to study accelerator-based neutrino oscillations, supernova neutrinos, and nucleon decay.
|[mailto:andrew.mcnab@cern.ch Andrew McNab]<br>[mailto:timm@fnal.gov Steve Timm]
|-
|[http://www.wenmr.eu enmr.eu]
|Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with user-friendly tools, platforms for data analysis and exchange, and an underlying e-infrastructure. WeNMR groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organization, which is currently the second largest VO in the area of life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology towards EGI for the users of existing infrastructures. It involves researchers from around the world and will build bridges to other areas of structural biology. Integration with SAXS, a rapidly growing and highly complementary method, is directly included in WeNMR, but links will also be established to related initiatives. WeNMR will serve all relevant INSTRUCT communities in line with the ESFRI roadmap.
|[mailto:Marco.Verlato@pd.infn.it Marco Verlato]<br>[mailto:a.m.j.j.bonvin@uu.nl Alexandre Bonvin]<br>[mailto:rosato@cerm.unifi.it Antonio Rosato]<br>[mailto:h.jonker@nmr.uni-frankfurt.de Henry Jonker]<br>[mailto:giachetti@cerm.unifi.it Andrea Giachetti]<br>[mailto:verlato@infn.it Marco Verlato]
|-
|[http://www.fnal.gov fermilab]
|Fermilab Virtual Organization (VO) - The Fermilab VO is an "umbrella" VO that includes the Fermilab Campus Grid (FermiGrid) and Fermilab Grid Testing (ITB) infrastructures, and all Fermilab computing activities that are not big enough to have their own Virtual Organization. Broadly these include the intensity frontier program, theoretical simulations, fixed target analysis, and accelerator and beamline design as well as activities performed by the Fermilab Campus Grid administrators.
|[mailto:garzoglio@fnal.gov Gabriele Garzoglio]<br>[mailto:boyd@fnal.gov Joe Boyd]
|-
|[http://lz.lbl.gov/ lz]
|This VO will support the LUX-ZEPLIN (LZ) experiment, designed to search for dark matter.
|[mailto:dasu@hep.wisc.edu Sridhara Dasu]<br>[mailto:dan@physics.wisc.edu Daniel Bradley]<br>[mailto:covuosalo@wisc.edu Carl Vuosalo]<br>[mailto:E.Korolkova@sheffield.ac.uk Elena Korolkova]<br>[mailto:j.dobson@ucl.ac.uk James Dobson]
|-
|[http://wwwcascina.virgo.infn.it/ virgo]
VO target: to allow data management and computationally intensive data analysis
|[mailto:cristiano.palomba@roma1.infn.it Cristiano Palomba]<br>[mailto:alberto.colla@roma1.infn.it Alberto Colla]
|-
|[http://mossaic.org/ vo.landslides.mossaic.org]
|A virtual organisation for landslide modellers associated with the Management of Slope Stability in Communities (MoSSaiC) project. The VO is used for running landslide modelling software such as CHASM and QUESTA.
|[mailto:l.kreczko@bristol.ac.uk Lukasz Kreczko]
<!-- end other approved list -->
|}
  
 
!Area
!Contact
<!-- start new approved list --><!-- end new approved list -->
|}
  
  
  
<!-- START OF SIDSECTION -->{{BOX VO|ALICE|<!-- VOMS RECORDS for ALICE -->
''' Filename: ''' /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc
<pre><nowiki>
 
<pre><nowiki>
"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"
</nowiki></pre>

Notes:
n/a
}}
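
Each box in this section gives, for one VO, the contents of the .lsc files to be placed in the VO's subdirectory of /etc/grid-security/vomsdir/ (the VOMS server host DN on the first line and its issuing CA DN on the second), together with the matching one-line client record for /etc/vomses. The following is only a rough sketch of how a site admin might install and check such a record by hand, using the ICECUBE values from the box below and the standard VOMS client tools; production sites will normally deploy these files via their configuration management instead:

<pre><nowiki>
# Sketch only: install the ICECUBE VOMS records shown in the box below.
mkdir -p /etc/grid-security/vomsdir/icecube
cat > /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc <<'EOF'
/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA
EOF

mkdir -p /etc/vomses
cat > /etc/vomses/icecube-grid-voms.desy.de <<'EOF'
"icecube" "grid-voms.desy.de" "15106" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "icecube"
EOF

# With a valid user certificate registered in the VO, this should return a
# proxy carrying the icecube VOMS attributes:
voms-proxy-init --voms icecube
voms-proxy-info --all
</nowiki></pre>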

{{BOX VO|ICECUBE|<!-- VOMS RECORDS for ICECUBE -->
''' Filename: ''' /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc
<pre><nowiki>
/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA
</nowiki></pre>

''' Filename: ''' /etc/vomses/icecube-grid-voms.desy.de
<pre><nowiki>
"icecube" "grid-voms.desy.de" "15106" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "icecube"
</nowiki></pre>
 
<pre><nowiki>
"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|VIRGO|<!-- VOMS RECORDS for VIRGO -->
''' Filename: ''' /etc/grid-security/vomsdir/virgo/voms-01.pd.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-01.pd.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/virgo/voms.cnaf.infn.it.lsc
<pre><nowiki>
/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4
</nowiki></pre>

''' Filename: ''' /etc/vomses/virgo-voms-01.pd.infn.it
<pre><nowiki>
"virgo" "voms-01.pd.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-01.pd.infn.it" "virgo"
</nowiki></pre>

''' Filename: ''' /etc/vomses/virgo-voms.cnaf.infn.it
<pre><nowiki>
"virgo" "voms.cnaf.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms.cnaf.infn.it" "virgo"
</nowiki></pre>

Notes:
n/a
}}
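
For reference, every /etc/vomses record on this page follows the same five-field layout. Annotated with descriptive labels (the labels are not part of the file format), the first virgo record above reads:

<pre><nowiki>
"<alias>" "<VOMS server host>" "<port>" "<VOMS server host DN>" "<VO name>"
"virgo" "voms-01.pd.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-01.pd.infn.it" "virgo"
</nowiki></pre>

The alias in the first field is what users pass to voms-proxy-init --voms; the final field is the VO name as published by the server.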

{{BOX VO|VO.COMPLEX-SYSTEMS.EU|<!-- VOMS RECORDS for VO.COMPLEX-SYSTEMS.EU -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.complex-systems.eu/voms2.hellasgrid.gr.lsc
<pre><nowiki>
/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.complex-systems.eu-voms2.hellasgrid.gr
<pre><nowiki>
"vo.complex-systems.eu" "voms2.hellasgrid.gr" "15160" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "vo.complex-systems.eu"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|VO.CTA.IN2P3.FR|<!-- VOMS RECORDS for VO.CTA.IN2P3.FR -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.cta.in2p3.fr/cclcgvomsli01.in2p3.fr.lsc
<pre><nowiki>
/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr
/C=FR/O=MENESR/OU=GRID-FR/CN=AC GRID-FR Services
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.cta.in2p3.fr-cclcgvomsli01.in2p3.fr
<pre><nowiki>
"vo.cta.in2p3.fr" "cclcgvomsli01.in2p3.fr" "15008" "/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr" "vo.cta.in2p3.fr"
</nowiki></pre>
  
 
<pre><nowiki>
"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|VO.SOUTHGRID.AC.UK|<!-- VOMS RECORDS for VO.SOUTHGRID.AC.UK -->
''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>

''' Filename: ''' /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk
<pre><nowiki>
"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"
</nowiki></pre>
 
<pre><nowiki>
"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|COMET.J-PARC.JP|<!-- VOMS RECORDS for COMET.J-PARC.JP -->
''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/comet.j-parc.jp/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms02.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms02.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>

''' Filename: ''' /etc/vomses/comet.j-parc.jp-voms03.gridpp.ac.uk
<pre><nowiki>
"comet.j-parc.jp" "voms03.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "comet.j-parc.jp"
</nowiki></pre>
 
<pre><nowiki>
"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|CERNATSCHOOL.ORG|<!-- VOMS RECORDS for CERNATSCHOOL.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk
<pre><nowiki>
"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"
</nowiki></pre>
 
</nowiki></pre>

Notes:
n/a
}}


{{BOX VO|LZ|<!-- VOMS RECORDS for LZ -->

Notes:
n/a
  
  
{{BOX VO|SOLIDEXPERIMENT.ORG|<!-- VOMS RECORDS for SOLIDEXPERIMENT.ORG -->
''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc
<pre><nowiki>
/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>

''' Filename: ''' /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk
<pre><nowiki>
"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"
</nowiki></pre>
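
Several of the VOs in the resource-requirements table below distribute their software over CVMFS (for example /cvmfs/calice.desy.de, /cvmfs/ilc.desy.de, /cvmfs/icecube.opensciencegrid.org, /cvmfs/wenmr.egi.eu and the LHCb mount points). As a rough check, a site can confirm that a worker node mounts and serves these repositories with the standard CernVM-FS client tool; the repository list in this sketch is simply collected from the table entries below:

<pre><nowiki>
# Sketch: probe the CVMFS repositories requested by VOs in the table below.
for repo in calice.desy.de ilc.desy.de icecube.opensciencegrid.org \
            wenmr.egi.eu lhcb.cern.ch; do
    cvmfs_config probe "$repo"
done
</nowiki></pre>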
  
  
<!-- START OF RESOURCES -->{|border="1" cellpadding="3" style="border-collapse:collapse;margin-bottom:40px;"
<!-- |+VO Resource Requirements -->
|-style="background:#7C8AAF;color:white"
 
|CVMFS is used for the software distribution via:
:/cvmfs/calice.desy.de

For setup instructions refer to:
 
of virtual address space more per core than memory to accommodate
the large amount of shared libraries used by jobs.
(For a typical 8-core pilot that would translate into a VZSIZE limit of at least 64GB.)

Cloud resources should provision 8-core VMs to match standard 8-core pilots.
 
|-
|dteam
|0
|None
|None
|0
|
|-
 
|
# For COVID-19 related jobs, slots with 8 GB/Core are required

# WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.

# The line:
"/enmr.eu/*"::::
has to be added to the group.conf file before configuring the grid services via YAIM.
In the CREAM-CE this reflects in the lines:
"/enmr.eu/*/Role=NULL/Capability=NULL" .enmr
 
|0
|
|-
|icecube
|4000
|2880
|2880
|40000
|CVMFS is used for the software distribution via:

/cvmfs/icecube.opensciencegrid.org
|-
 
|ilc
|CVMFS is used for the software distribution via:
:/cvmfs/ilc.desy.de

For setup instructions refer to:
 
The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of an LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB which needs to be added as a requirement to the field "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.

The amount of space in the field "Max size of scratch space used by jobs" shall be interpreted as 50 % each for downloaded input files and produced output files.

Sites should have the Centos7 or "Cern Centos7" operating system, or later versions, installed on their worker nodes. Sites are requested to provide support for singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number.

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points /cvmfs/lhcb.cern.ch, /cvmfs/lhcb-condb.cern.ch, /cvmfs/grid.cern.ch and /cvmfs/cernvm-prod.cern.ch on the worker nodes.

Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non T1 sites providing CVMFS, direct HTCondorCE, ARC, or CREAM submission and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaigns).

Sites not having an SRM installation must provide:
* disk only storage
* a GridFTP endpoint (a single DNS entry)
 
|1000
 
|1000
 
|t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
 
|t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
 +
|-
 +
|virgo
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.complex-systems.eu
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.cta.in2p3.fr
 +
|0
 +
|0
 +
|2000
 +
|0
 +
|
 
|-
 
|-
 
|vo.northgrid.ac.uk
 
|vo.northgrid.ac.uk
Line 1,642: Line 1,923:
 
|-
 
|-
 
|vo.scotgrid.ac.uk
 
|vo.scotgrid.ac.uk
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|vo.southgrid.ac.uk
 
|0
 
|0
 
|0
 
|0
Line 1,655: Line 1,943:
 
|CVMFS is used for the software distribution via:
 
|CVMFS is used for the software distribution via:
  
:/cvmfs/zeus.desy.de
+
:/cvmfs/zeus.desy.de  
  
 
For setup instructions refer to:
 
For setup instructions refer to:
Line 1,690: Line 1,978:
  
 
Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch
 
Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch
 +
|-
 +
|comet.j-parc.jp
 +
|2048
 +
|1440
 +
|2880
 +
|40960
 +
|
 
|-
 
|-
 
|epic.vo.gridpp.ac.uk
 
|epic.vo.gridpp.ac.uk
Line 1,711: Line 2,006:
 
|1440
 
|1440
 
|10000
 
|10000
 +
|
 +
|-
 +
|cernatschool.org
 +
|0
 +
|0
 +
|0
 +
|0
 
|
 
|
 
|-
 
|-
Line 1,721: Line 2,023:
 
|-
 
|-
 
|vo.moedal.org
 
|vo.moedal.org
 +
|0
 +
|0
 +
|0
 +
|0
 +
|
 +
|-
 +
|lz
 
|0
 
|0
 
|0
 
|0
Line 1,728: Line 2,037:
 
|-
 
|-
 
|skatelescope.eu
 
|skatelescope.eu
 +
|0
 
|None
 
|None
 
|None
 
|None
|None
+
|0
|None
+
 
|
 
|
 
|-
 
|-
Line 1,741: Line 2,050:
 
|
 
|
 
|-
 
|-
|supernemo.org
+
|solidexperiment.org
|None
+
|0
|None
+
|0
|None
+
|0
|None
+
|0
|
+
|will need to set up CVMFS.
 
|}
 
|}
 
<!-- END OF RESOURCES -->
 
<!-- END OF RESOURCES -->

Revision as of 12:57, 21 April 2022

Questions

Here are a few starting questions I have about the approved VO pages.

  1. Has the procedure changed for approving VOs?
  2. Is the "Cleanup Campaign" still running?
  3. Do we want just UK contacts, or should all contacts be included in the VO data?
  4. Who are the missing UK VO contacts?
  5. What is the "ops" VO?
  6. Do you want email addresses on the contacts? Some have them. If so, should they be obfuscated to stop email scraping?
  7. Some local VOs have links to the join URL. Should they all?

Introduction

The GridPP Project Management Board has agreed that up to 10 % of GridPP's processing capability should be allocated for non-LHC work. VOs that access the Grid like this must become Approved VOs; policies for managing them are described here: Policies_for_GridPP_approved_VOs.

The tables below indicate VOs that the GridPP PMB has approved, and the PMB encourages support for these VOs at all of its collaborating sites. Information about all European Grid Initiative (EGI), global and local VOs is given in the Operations portal, which is the main reference source for VO information (including VO manager, end-points, requirements etc.).

Yum repository

RPM versions of the VOMS records for Approved VOs are available via the VOMS RPMS Yum Repository.

VOMS RPM Repository v1.16-1
TODO: Change RPM download location.

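For orientation, a yum repository definition pointing at these RPMs would look roughly like the sketch below. The repository id, name and baseurl are placeholders (the real download location is the subject of the TODO above), so adjust them before use.

# /etc/yum.repos.d/gridpp-voms.repo -- illustrative sketch only
[gridpp-voms]
name=VOMS records for GridPP approved VOs
# placeholder: replace with the actual VOMS RPMS repository URL
baseurl=http://example.org/voms-rpms/
enabled=1
gpgcheck=0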

Please Note

Please do not change the vomsdir/ or vomses/ entries or the VO Resource Requirements section below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!

Cleanup Campaign

Approved EGI VOs

Name Area Contact
alice The ALICE Collaboration is operating a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Our aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected. Latchezar Betev
Maarten Litmaath
Costin Grigoras
White Rabit
atlas The ATLAS VO allows the members of the ATLAS collaboration to perform all the computing activities relevant for the ATLAS experiment, making use of the available resources following the policy defined by the Collaboration. Alessandro De Salvo
Simone Campana
Alessandro Di Girolamo
John Hover
Luca Vaccarossa
Elisabetta Vilucchi
John De Stefano Jr
James Walder
biomed This VO covers the areas related to health and life sciences. Currently, it is divided into 3 sectors: medical imaging, bioinformatics and drug discovery. The VO is openly accessible to academics, and to private companies for non-commercial purposes. Tristan Glatard
Jerome Pansanel
Franck Michel
Sorina Camarasu
Tristan Glatard
cms The Compact Muon Solenoid (CMS) experiment is a large general-purpose particle physics detector built at the proton-proton Large Hadron Collider (LHC) at CERN in Switzerland. Andreas Pfeiffer
Stefano Belforte
Stefano Belforte
Bonacorsi Daniele
Christoph Wissing
Elizabeth Sexton-Kennedy
Stephan Lammel
Jose Hernandez
Bonacorsi Daniele
Andrea Sciaba
Oliver Gutsche
dteam The goal of the VO is to facilitate the deployment of a stable production Grid infrastructure. To this end, members of this VO (who have to be associated with a registered site and be involved in its operation) are allowed to run tests to validate the correct configuration of their site. Site performance evaluation and/or monitoring programs may also be run under the DTEAM VO with the approval of the Site Manager, subject to the agreement of the affected sites' management. Kostas Koumantaros
Alessandro Paolini
Matthew Viljoen
Kyriakos Gkinis
Alessandro Paolini
esr The Earth Science Research covers research in the fields of Solid Earth, Ocean, Atmosphere and their interfaces. A large variety of communities correspond to each domain, some of them covering several domains.


In the ESR Virtual Organization (ESR-VO) four domains are represented:

  1. Earth Observation
  2. Climate
  3. Hydrology
  4. Solid Earth Physics
Andre Gemuend
David Weissenbach
David Weissenbach
David Weissenbach
geant4 Geant4 is a toolkit for the simulation of the passage of particles through matter. Its areas of application include high energy, nuclear and accelerator physics, as well as studies in medical and space science. The two main reference papers for Geant4 are published in Nuclear Instruments and Methods in Physics Research A 506 (2003) 250-303, and IEEE Transactions on Nuclear Science 53 No. 1 (2006) 270-278. Andrea Sciaba
Andrea Sciaba
Andrea Dotti
lhcb The LHCb (Large Hadron Collider Beauty) experiment is mainly set on finding the solution to the mystery of the matter-antimatter imbalance in the Universe. joel Closier
Andrew McNab
Concezio Bozzi
Ben Couturier
joel Closier
magic MAGIC is a system of two imaging atmospheric Cherenkov telescopes (or IACTs). MAGIC-I started routine operation after commissioning in 2004. Construction of MAGIC-II was completed in early 2009, and the two telescopes have been in operation ever since, with a break in 2012 for an upgrade that achieved full homogeneity. The project is funded primarily by the funding agencies BMBF (Germany), MPG (Germany), INFN (Italy), MICINN (Spain), and the ETH Zurich (Switzerland). Neissner Christian
Jose Luis Contreras
Roger Firpo
ops The goal of the VO is to facilitate the operations of the LCG/EGI infrastructure, which includes running official monitoring, re-certification and performance evaluation tools. Additionally the VO will be used for interoperations with other grid infrastructures. Emir Imamagic
Alessandro Paolini
Vincenzo Spinoso
t2k.org T2K is a neutrino experiment designed to investigate how neutrinos change from one flavour to another as they travel (neutrino oscillations). An intense beam of muon neutrinos is generated at the J-PARC nuclear physics site on the East coast of Japan and directed across the country to the Super-Kamiokande neutrino detector in the mountains of western Japan. The beam is measured once before it leaves the J-PARC site, using the near detector ND280, and again at Super-K: the change in the measured intensity and composition of the beam is used to provide information on the properties of neutrinos. Lukas Koch
Sophie King
Tomislav Vladisavljevic
vo.moedal.org The MoEDAL VO allows members of the MoEDAL Collaboration to perform all of the computing activities relevant for the MoEDAL experiment, making use of available resources according to the policy defined by the Collaboration. Tom Whyntie
Daniel Felea

Approved Global VOs

Name Area Contact
calice CAlorimeter for the LInear Collider Experiment

A high granularity calorimeter optimised for the Particle Flow measurement of multi-jet final states at the International Linear Collider, running at a centre-of-mass energy between 90 GeV and 1 TeV.

Roman Poeschl
Shaojun Lu
Andreas Gellrich
icecube The goal of the VO is to enable the usage of Grid resources for ICECUBE collaboration members, mainly for simulation and reconstruction. Damian Pieloth
ilc VO for the International Linear Collider Community. Frank Gaede
Andreas Gellrich
Christoph Wissing
ipv6.hepix.org The goal of the VO is to carry out testing of IPv6 readiness, functionality and performance of the middleware, applications and tools required by the stakeholder communities, especially HEP. Other authorised activities include use of the testbed by related IPv6 activities inside EGI, the related middleware technology providers and other Infrastructures used by WLCG/HEP. david kelsey
lsst The Large Synoptic Survey Telescope (LSST) is a large-aperture, wide-field survey telescope with a 3200 Megapixel camera, built to image faint astronomical objects, rapidly scan the sky and observe probes for dark matter and dark energy. LSST Data Management and Simulation jobs will run on OSG and EGI. Christian Arnault
Dominique Boutigny
Gabriele Garzoglio
Iain Goodenow
Fabio Hernandez
na62.vo.gridpp.ac.uk The NA62 VO (na62.vo.gridpp.ac.uk) is meant to provide grid computing and data storage resources to the NA62 collaboration. The NA62 VO is supported by University of Cambridge, University of Glasgow, Imperial College London, University of Birmingham, University of Lancaster, University of Liverpool, University of Manchester, Oxford University and RAL (from UK), CERN, CNAF (Italy) and UCL (Belgium). More info about the NA62 experiment can be found on http://na62.web.cern.ch/na62/. The production portal is located at http://na62.gla.ac.uk/ Dan Protopopescu
David Britton
skatelescope.eu The Square Kilometre Array (SKA) project is an international effort to build the world’s largest radio telescope, with eventually over a square kilometre (one million square metres) of collecting area. The scale of the SKA represents a huge leap forward in both engineering and research & development towards building and delivering a unique instrument, with the detailed design and preparation now well under way. As one of the largest scientific endeavours in history, the SKA will bring together a wealth of the world’s finest scientists, engineers and policy makers to bring the project to fruition.

The VO skatelescope.eu is the VO supporting this project.

Alessandra Forti
Andrew McNab
Rohini Joshi
zeus ZEUS is a collaboration of about 450 physicists who are running a large particle detector at the electron-proton collider HERA at the DESY laboratory in Hamburg. The ZEUS detector is a sophisticated tool for studying the particle reactions provided by the high-energetic beams of the HERA accelerator. Thus the participating scientists are pushing forward our knowledge of the fundamental particles and forces of nature, gaining unsurpassed insight into the exciting laws of the microcosm. Andreas Gellrich

Approved Local VOs

Name Area Contact
cernatschool.org The CERN@school VO represents the CERN@school project on the Grid. CERN@school aims to bring CERN technology into the classroom to aid with the teaching of physics and to inspire the next generation of scientists and engineers. The CERN@school VO will allow students and teachers involved with the project to harness GridPP to store and analyse data from the CERN@school detectors, the LUCID experiment and the associated GEANT4 simulations.
epic.vo.gridpp.ac.uk EPIC replaces an earlier EPIC project that was focused upon Veterinary Surveillance (Phase 1). This new consortium EPIC project aims to become a world leader in policy linked research and includes some of Scotland’s leading veterinary epidemiologists and scientists.

The overarching purpose for the Centre is to provide access to high quality advice and analyses on the epidemiology and control of animal diseases that are important to Scotland, and to best prepare Scotland for the next major disease incursion. Ultimately, this strategic advice to the Scottish Government will help ensure that the interests of the various stakeholders involved in disease emergency planning and response are met as effectively as possible. This all must be achieved within the context of our rapidly changing environment. For example, issues such as climate change are now influencing the livestock disease risks that Scotland faces.

thomas doherty
gridpp GridPP is a collaboration of particle physicists and computer scientists from the UK and CERN. They are building a distributed computing Grid across the UK for particle physicists. At the moment there is a working particle physics Grid across 17 UK institutions. Coles Jeremy
hyperk.org We propose the Hyper-Kamiokande (Hyper-K) detector as a next generation underground water Cherenkov detector. It will serve as a far detector of a long baseline neutrino oscillation experiment envisioned for the upgraded J-PARC, and as a detector capable of observing -- far beyond the sensitivity of the Super-Kamiokande (Super-K) detector -- proton decays, atmospheric neutrinos, and neutrinos from astronomical origins. The baseline design of Hyper-K is based on the highly successful Super-K, taking full advantage of a well-proven technology. Christopher Walker
Francesca di lodovico
mice A VO to support the activities of the Muon Ionisation Cooling Experiment (MICE). Specifically it is to enable the moving of MICE data around the Grid followed by the submission of analysis to these data. This is expected to be a small VO. David Colling
Paul Hodgson
Daniela Bauer
Janusz Martyniak
pheno Phenogrid is the VO for UK theorists that don't fit within one of the LHC experiments (e.g. developers of Monte Carlos etc.). The rest of this text exists only to satisfy the extremely unnecessary minimum limit of 200 characters. Jeppe Andersen
Adam Boutcher
Paul Clark
snoplus.snolab.ca VO for the snoplus experiment, a multi-purpose liquid scintillator neutrino experiment based in Sudbury, Canada. Members of the snoplus virtual organisation will contribute to the European computing effort to accurately simulate the SNOplus detector response. Jeanne Wilson
Christopher Walker
Matthew Mottram
solidexperiment.org Supports grid users of the SoLid experiment. Daniela Bauer
antonin vacheret
vo.northgrid.ac.uk Regional Virtual Organisation created to allow access to HEP resources to other local disciplines from Northgrid sites: Manchester, Lancaster, Liverpool, Sheffield. Users from these universities can apply. Alessandra Forti
Robert Frank
Robert Frank
vo.scotgrid.ac.uk The VO is for academic and other users in Scotland to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access. It is also designed as a test VO to allow maintenance and operational testing of site services. Gareth Roy
vo.southgrid.ac.uk The VO is for academic and other users in the SouthGrid (UKI-SOUTHGRID-BHAM-HEP, UKI-SOUTHGRID-BRIS-HEP, UKI-SOUTHGRID-CAM-HEP, UKI-SOUTHGRID-OX-HEP, UKI-SOUTHGRID-RALPP, UKI-SOUTHGRID-SUSX) region to test access to EGI resources. Users will join this VO before deciding whether to set up one of their own for long-term access.

Peter Gronbech

Other VOs

This area can be used to record information about VOs that are site specific or localised in a region. This section can be used to advertise a local VO that you would like supported elsewhere.

Name Area Contact
vo.complex-systems.eu The goal of the vo.complex-systems.eu is to promote the study of complex systems and complex networks on the Grid infrastructure. The vo.complex-systems.eu Virtual Organization will also serve as the building layer of collaboration among international scientists focusing on the research area of Complexity Science.

Romain Reuillon
comet.j-parc.jp Muon-to-electron conversion experiment at J-PARC, which will be used by international COMET collaborators for design studies and data analysis. COMET will test Beyond-the-Standard-Model physics in a way that is complementary to the experiments at the LHC. Daniela Bauer
Yoshi Uchida
Simon Fayer
vo.cta.in2p3.fr Monte Carlo simulation production and analysis for the "CTA - Cherenkov Telescope Array" international consortium.

Cecile Barbier
Luisa Arrabito
dune DUNE is the Deep Underground Neutrino Experiment managed by the global DUNE collaboration and hosted at Fermilab. We are building a deep-underground Liquid-Argon based neutrino detector to study accelerator-based neutrino oscillations, supernova neutrinos, and nucleon decay. Andrew McNab
Steve Timm
enmr.eu Structural biology and life sciences in general, and NMR in particular, have always been associated with advanced computing. The current challenges in the post-genomic era call for virtual research platforms that provide the worldwide research community with both user-friendly tools, platforms for data analysis and exchange, and an underlying e-infrastructure. WeNMR groups different research teams into a worldwide virtual research community. It builds on the established eNMR e-Infrastructure and its steadily growing virtual organization, which is currently the second largest VO in the area of life sciences. WeNMR provides an e-Infrastructure platform and Science Gateway for structural biology towards EGI for the users of existing infrastructures. It involves researchers from around the world and will build bridges to other areas of structural biology. Integration with SAXS, a rapidly growing and highly complementary method, is directly included in WeNMR, but links will also be established to related initiatives. WeNMR will serve all relevant INSTRUCT communities in line with the ESFRI roadmap. Marco Verlato
Alexandre Bonvin
Antonio Rosato
Henry Jonker
Andrea Giachetti
Marco Verlato
fermilab Fermilab Virtual Organization (VO) - The Fermilab VO is an "umbrella" VO that includes the Fermilab Campus Grid (FermiGrid) and Fermilab Grid Testing (ITB) infrastructures, and all Fermilab computing activities that are not big enough to have their own Virtual Organization. Broadly these include the intensity frontier program, theoretical simulations, fixed target analysis, and accelerator and beamline design as well as activities performed by the Fermilab Campus Grid administrators. Gabriele Garzoglio
Joe Boyd
lz This VO will support the LUX Zeplin experiment, designed to search for Dark Matter. Sridhara Dasu
Daniel Bradley
Carl Vuosalo
Elena Korolkova
James Dobson
virgo Scientific target: detection of gravitational waves. Gravitational waves are predicted by the General Theory of Relativity but still not directly detected due to their extremely weak interaction with matter. Large interferometric detectors, like Virgo, are operating with the aim of directly detecting gravitational signals from various astrophysical sources. Signals are expected to be deeply buried in detector noise, and suitable data analysis algorithms are developed in order to allow detection and signal parameter estimation. For many kinds of searches large computing resources are needed and in some important cases we are computationally bound: the larger the available computing power, the wider the portion of source parameter space that can be explored.

VO target: to allow data management and computationally intensive data analysis

Cristiano Palomba
Alberto Colla
vo.landslides.mossaic.org A virtual organisation for landslide modellers associated with the Management of Slope Stability in Communities (MoSSaiC) project. The VO is used for running landslide modelling software such as CHASM and QUESTA. Lukasz Kreczko

Approved VOs being established into GridPP infrastructure

As part of its commitment to various projects, the GridPP PMB has approved the establishment of the following VOs (your site cannot yet support these, but when the VO is set up and functioning we will let you know).

Name Area Contact

VOs that have been removed from approved list

The table below comprises a history of VOs that have been removed from the approved list for various reasons.

Name Date of removal Notes
babar 9 Oct 2013 none
camont 7th June 2017 none
camont.gridpp.ac.uk 9 Oct 2013 none
cdf 7th June 2017 none
cedar 9 Oct 2013 none
dzero 7th June 2017 none
fusion 30 Jan 2017 Discussion with Rubén Vallés Pérez. VO appears defunct.
hone 24 Nov 2015 Discussed at Ops Meeting. Defunct.
ltwo 9 Oct 2013 none
minos.vo.gridpp.ac.uk 9 Oct 2013 none
na48 9 Oct 2013 none
neiss 7th June 2017 none
ngs.ac.uk 9 Oct 2013 none
superbvo.org 19 Jan 2016 Discussed at Ops Meeting. Defunct.
supernemo.vo.eu-egee.org 24 Feb 2020 now called supernemo.org
totalep 9 Oct 2013 none
vo.londongrid.ac.uk in progress [GGUS] VO not used any more
vo.sixt.cern.ch 11 Nov 2015 No members, no voms servers, defunct

Example site-info.def entries

The examples of site-info.def entries for yaim have been moved: Example site-info.def entries
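
For orientation only, the kind of yaim variables such a site-info.def (or vo.d/) entry contains is sketched below for t2k.org. The vomses strings and CA DN are taken from the t2k.org listings further down this page; the vomss:// URL, software directory and default SE are illustrative placeholders that each site sets for itself.

# Sketch of site-info.def entries for t2k.org -- not the maintained example page.
# (t2k.org also needs to be present in the site's VOS list.)
VO_T2K_ORG_VOMS_SERVERS="'vomss://voms.gridpp.ac.uk:8443/voms/t2k.org?/t2k.org'"
VO_T2K_ORG_VOMSES="'t2k.org voms.gridpp.ac.uk 15003 /C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk t2k.org' 't2k.org voms02.gridpp.ac.uk 15003 /C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk t2k.org' 't2k.org voms03.gridpp.ac.uk 15003 /C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk t2k.org'"
VO_T2K_ORG_VOMS_CA_DN="'/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B' '/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B' '/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B'"
VO_T2K_ORG_SW_DIR=$VO_SW_DIR/t2k.org      # placeholder
VO_T2K_ORG_DEFAULT_SE=se01.example.ac.uk  # placeholder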

Please Note

Please do not change the vomsdir/ or vomses/ entries below, as they are automatically updated from the EGI Operations Portal. Any changes you make will be lost!


Virtual Organisation: ALICE

Filename: /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/alice/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/alice-lcg-voms2.cern.ch

"alice" "lcg-voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "alice"

Filename: /etc/vomses/alice-voms2.cern.ch

"alice" "voms2.cern.ch" "15000" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "alice"

Notes: n/a
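
As a concrete illustration (a sketch only, reusing exactly the ALICE entries above): each .lsc file is a plain text file holding the DN of the VOMS server followed by the DN of its CA, and a user who is actually a member of ALICE can then test the setup from a UI with voms-proxy-init.

# Sketch: recreate one ALICE vomsdir entry by hand
# (in practice these files are installed from the VOMS RPMs / Operations Portal data).
mkdir -p /etc/grid-security/vomsdir/alice
cat > /etc/grid-security/vomsdir/alice/lcg-voms2.cern.ch.lsc <<'EOF'
/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority
EOF

# Quick check (requires a valid grid certificate and ALICE membership):
voms-proxy-init --voms alice
voms-proxy-info -all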


Virtual Organisation: ATLAS

Filename: /etc/grid-security/vomsdir/atlas/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/atlas/voms-atlas-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/atlas-lcg-voms2.cern.ch

"atlas" "lcg-voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms2.cern.ch

"atlas" "voms2.cern.ch" "15001" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "atlas"

Filename: /etc/vomses/atlas-voms-atlas-auth.app.cern.ch

"atlas" "voms-atlas-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=atlas-auth.web.cern.ch" "atlas"

Notes: n/a


Virtual Organisation: BIOMED

Filename: /etc/grid-security/vomsdir/biomed/cclcgvomsli01.in2p3.fr.lsc

/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr
/C=FR/O=MENESR/OU=GRID-FR/CN=AC GRID-FR Services

Filename: /etc/vomses/biomed-cclcgvomsli01.in2p3.fr

"biomed" "cclcgvomsli01.in2p3.fr" "15000" "/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr" "biomed"

Notes: n/a


Virtual Organisation: CALICE

Filename: /etc/grid-security/vomsdir/calice/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/calice-grid-voms.desy.de

"calice" "grid-voms.desy.de" "15102" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "calice"

Notes: n/a


Virtual Organisation: CMS

Filename: /etc/grid-security/vomsdir/cms/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/cms/voms-cms-auth.app.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/cms-lcg-voms2.cern.ch

"cms" "lcg-voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms2.cern.ch

"cms" "voms2.cern.ch" "15002" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "cms"

Filename: /etc/vomses/cms-voms-cms-auth.app.cern.ch

"cms" "voms-cms-auth.app.cern.ch" "443" "/DC=ch/DC=cern/OU=computers/CN=cms-auth.web.cern.ch" "cms"

Notes: n/a


Virtual Organisation: DTEAM

Filename: /etc/grid-security/vomsdir/dteam/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/dteam-voms2.hellasgrid.gr

"dteam" "voms2.hellasgrid.gr" "15004" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "dteam"

Notes: n/a


Virtual Organisation: ENMR.EU

Filename: /etc/grid-security/vomsdir/enmr.eu/voms-02.pd.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/grid-security/vomsdir/enmr.eu/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/enmr.eu-voms-02.pd.infn.it

"enmr.eu" "voms-02.pd.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-02.pd.infn.it" "enmr.eu"

Filename: /etc/vomses/enmr.eu-voms2.cnaf.infn.it

"enmr.eu" "voms2.cnaf.infn.it" "15014" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it" "enmr.eu"

Notes: n/a


Virtual Organisation: ESR

Filename: /etc/grid-security/vomsdir/esr/voms.grid.sara.nl.lsc

/DC=org/DC=terena/DC=tcs/C=NL/L=Utrecht/O=SURF B.V./CN=voms1.grid.surfsara.nl
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/esr-voms.grid.sara.nl

"esr" "voms.grid.sara.nl" "30001" "/DC=org/DC=terena/DC=tcs/C=NL/L=Utrecht/O=SURF B.V./CN=voms1.grid.surfsara.nl" "esr"

Notes: n/a


Virtual Organisation: GEANT4

Filename: /etc/grid-security/vomsdir/geant4/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/geant4/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/geant4-lcg-voms2.cern.ch

"geant4" "lcg-voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "geant4"

Filename: /etc/vomses/geant4-voms2.cern.ch

"geant4" "voms2.cern.ch" "15007" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "geant4"

Notes: n/a


Virtual Organisation: GRIDPP

Filename: /etc/grid-security/vomsdir/gridpp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/gridpp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/gridpp-voms.gridpp.ac.uk

"gridpp" "voms.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms02.gridpp.ac.uk

"gridpp" "voms02.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "gridpp"

Filename: /etc/vomses/gridpp-voms03.gridpp.ac.uk

"gridpp" "voms03.gridpp.ac.uk" "15000" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "gridpp"

Notes: n/a


Virtual Organisation: ICECUBE

Filename: /etc/grid-security/vomsdir/icecube/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/icecube-grid-voms.desy.de

"icecube" "grid-voms.desy.de" "15106" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "icecube"

Notes: n/a


Virtual Organisation: ILC

Filename: /etc/grid-security/vomsdir/ilc/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/ilc-grid-voms.desy.de

"ilc" "grid-voms.desy.de" "15110" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "ilc"

Notes: n/a


Virtual Organisation: LHCB

Filename: /etc/grid-security/vomsdir/lhcb/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/lhcb/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/lhcb-lcg-voms2.cern.ch

"lhcb" "lcg-voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "lhcb"

Filename: /etc/vomses/lhcb-voms2.cern.ch

"lhcb" "voms2.cern.ch" "15003" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "lhcb"

Notes: n/a


Virtual Organisation: MAGIC

Notes: n/a


Virtual Organisation: OPS

Filename: /etc/grid-security/vomsdir/ops/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/ops/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/ops-lcg-voms2.cern.ch

"ops" "lcg-voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "ops"

Filename: /etc/vomses/ops-voms2.cern.ch

"ops" "voms2.cern.ch" "15009" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "ops"

Notes: n/a


Virtual Organisation: PHENO

Filename: /etc/grid-security/vomsdir/pheno/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/pheno/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/pheno-voms.gridpp.ac.uk

"pheno" "voms.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms02.gridpp.ac.uk

"pheno" "voms02.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "pheno"

Filename: /etc/vomses/pheno-voms03.gridpp.ac.uk

"pheno" "voms03.gridpp.ac.uk" "15011" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "pheno"

Notes: n/a


Virtual Organisation: SNOPLUS.SNOLAB.CA

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/snoplus.snolab.ca/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/snoplus.snolab.ca-voms.gridpp.ac.uk

"snoplus.snolab.ca" "voms.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms02.gridpp.ac.uk

"snoplus.snolab.ca" "voms02.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "snoplus.snolab.ca"

Filename: /etc/vomses/snoplus.snolab.ca-voms03.gridpp.ac.uk

"snoplus.snolab.ca" "voms03.gridpp.ac.uk" "15503" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "snoplus.snolab.ca"

Notes: n/a


Virtual Organisation: T2K.ORG

Filename: /etc/grid-security/vomsdir/t2k.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/t2k.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/t2k.org-voms.gridpp.ac.uk

"t2k.org" "voms.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms02.gridpp.ac.uk

"t2k.org" "voms02.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "t2k.org"

Filename: /etc/vomses/t2k.org-voms03.gridpp.ac.uk

"t2k.org" "voms03.gridpp.ac.uk" "15003" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "t2k.org"

Notes: n/a


Virtual Organisation: VIRGO

Filename: /etc/grid-security/vomsdir/virgo/voms-01.pd.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-01.pd.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/grid-security/vomsdir/virgo/voms.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/virgo-voms-01.pd.infn.it

"virgo" "voms-01.pd.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/ST=Roma/O=Istituto Nazionale di Fisica Nucleare/CN=voms-01.pd.infn.it" "virgo"

Filename: /etc/vomses/virgo-voms.cnaf.infn.it

"virgo" "voms.cnaf.infn.it" "15009" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms.cnaf.infn.it" "virgo"

Notes: n/a


Virtual Organisation: VO.COMPLEX-SYSTEMS.EU

Filename: /etc/grid-security/vomsdir/vo.complex-systems.eu/voms2.hellasgrid.gr.lsc

/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr
/C=GR/O=HellasGrid/OU=Certification Authorities/CN=HellasGrid CA 2016

Filename: /etc/vomses/vo.complex-systems.eu-voms2.hellasgrid.gr

"vo.complex-systems.eu" "voms2.hellasgrid.gr" "15160" "/C=GR/O=HellasGrid/OU=hellasgrid.gr/CN=voms2.hellasgrid.gr" "vo.complex-systems.eu"

Notes: n/a


Virtual Organisation: VO.CTA.IN2P3.FR

Filename: /etc/grid-security/vomsdir/vo.cta.in2p3.fr/cclcgvomsli01.in2p3.fr.lsc

/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr
/C=FR/O=MENESR/OU=GRID-FR/CN=AC GRID-FR Services

Filename: /etc/vomses/vo.cta.in2p3.fr-cclcgvomsli01.in2p3.fr

"vo.cta.in2p3.fr" "cclcgvomsli01.in2p3.fr" "15008" "/O=GRID-FR/C=FR/O=CNRS/OU=CC-IN2P3/CN=cclcgvomsli01.in2p3.fr" "vo.cta.in2p3.fr"

Notes: n/a


Virtual Organisation: VO.NORTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.northgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.northgrid.ac.uk-voms.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms02.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms02.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.northgrid.ac.uk"

Filename: /etc/vomses/vo.northgrid.ac.uk-voms03.gridpp.ac.uk

"vo.northgrid.ac.uk" "voms03.gridpp.ac.uk" "15018" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.northgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SCOTGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.scotgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms02.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms02.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Filename: /etc/vomses/vo.scotgrid.ac.uk-voms03.gridpp.ac.uk

"vo.scotgrid.ac.uk" "voms03.gridpp.ac.uk" "15509" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.scotgrid.ac.uk"

Notes: n/a


Virtual Organisation: VO.SOUTHGRID.AC.UK

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.southgrid.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.southgrid.ac.uk-voms.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms02.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms02.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.southgrid.ac.uk"

Filename: /etc/vomses/vo.southgrid.ac.uk-voms03.gridpp.ac.uk

"vo.southgrid.ac.uk" "voms03.gridpp.ac.uk" "15019" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.southgrid.ac.uk"

Notes: n/a


Virtual Organisation: ZEUS

Filename: /etc/grid-security/vomsdir/zeus/grid-voms.desy.de.lsc

/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de
/C=DE/O=GermanGrid/CN=GridKa-CA

Filename: /etc/vomses/zeus-grid-voms.desy.de

"zeus" "grid-voms.desy.de" "15112" "/C=DE/O=GermanGrid/OU=DESY/CN=host/grid-voms.desy.de" "zeus"

Notes: n/a


Virtual Organisation: MICE

Filename: /etc/grid-security/vomsdir/mice/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/mice/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/mice-voms.gridpp.ac.uk

"mice" "voms.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms02.gridpp.ac.uk

"mice" "voms02.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "mice"

Filename: /etc/vomses/mice-voms03.gridpp.ac.uk

"mice" "voms03.gridpp.ac.uk" "15001" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "mice"

Notes: n/a


Virtual Organisation: VO.LANDSLIDES.MOSSAIC.ORG

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/vo.landslides.mossaic.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/vo.landslides.mossaic.org-voms.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms02.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms02.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "vo.landslides.mossaic.org"

Filename: /etc/vomses/vo.landslides.mossaic.org-voms03.gridpp.ac.uk

"vo.landslides.mossaic.org" "voms03.gridpp.ac.uk" "15502" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "vo.landslides.mossaic.org"

Notes: n/a


Virtual Organisation: IPV6.HEPIX.ORG

Filename: /etc/grid-security/vomsdir/ipv6.hepix.org/voms2.cnaf.infn.it.lsc

/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it
/C=NL/O=GEANT Vereniging/CN=GEANT eScience SSL CA 4

Filename: /etc/vomses/ipv6.hepix.org-voms2.cnaf.infn.it

"ipv6.hepix.org" "voms2.cnaf.infn.it" "15013" "/DC=org/DC=terena/DC=tcs/C=IT/L=Frascati/O=Istituto Nazionale di Fisica Nucleare/OU=CNAF/CN=voms2.cnaf.infn.it" "ipv6.hepix.org"

Notes: n/a


Virtual Organisation: NA62.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/na62.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Filename: /etc/vomses/na62.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"na62.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15501" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "na62.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: COMET.J-PARC.JP

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/comet.j-parc.jp/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/comet.j-parc.jp-voms.gridpp.ac.uk

"comet.j-parc.jp" "voms.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "comet.j-parc.jp"

Filename: /etc/vomses/comet.j-parc.jp-voms02.gridpp.ac.uk

"comet.j-parc.jp" "voms02.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "comet.j-parc.jp"

Filename: /etc/vomses/comet.j-parc.jp-voms03.gridpp.ac.uk

"comet.j-parc.jp" "voms03.gridpp.ac.uk" "15505" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "comet.j-parc.jp"

Notes: n/a


Virtual Organisation: EPIC.VO.GRIDPP.AC.UK

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/epic.vo.gridpp.ac.uk/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms.gridpp.ac.uk" "15507" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms02.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms02.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Filename: /etc/vomses/epic.vo.gridpp.ac.uk-voms03.gridpp.ac.uk

"epic.vo.gridpp.ac.uk" "voms03.gridpp.ac.uk" "15027" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "epic.vo.gridpp.ac.uk"

Notes: n/a


Virtual Organisation: LSST

Filename: /etc/grid-security/vomsdir/lsst/voms.slac.stanford.edu.lsc

/DC=org/DC=incommon/C=US/ST=California/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/lsst-voms.slac.stanford.edu

"lsst" "voms.slac.stanford.edu" "15003" "/DC=org/DC=incommon/C=US/ST=California/L=Stanford/O=Stanford University/OU=SLAC/CN=voms.slac.stanford.edu" "lsst"

Notes: n/a


Virtual Organisation: HYPERK.ORG

Filename: /etc/grid-security/vomsdir/hyperk.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/hyperk.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/hyperk.org-voms.gridpp.ac.uk

"hyperk.org" "voms.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms02.gridpp.ac.uk

"hyperk.org" "voms02.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "hyperk.org"

Filename: /etc/vomses/hyperk.org-voms03.gridpp.ac.uk

"hyperk.org" "voms03.gridpp.ac.uk" "15510" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "hyperk.org"

Notes: n/a


Virtual Organisation: CERNATSCHOOL.ORG

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/cernatschool.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/cernatschool.org-voms.gridpp.ac.uk

"cernatschool.org" "voms.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms02.gridpp.ac.uk

"cernatschool.org" "voms02.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "cernatschool.org"

Filename: /etc/vomses/cernatschool.org-voms03.gridpp.ac.uk

"cernatschool.org" "voms03.gridpp.ac.uk" "15500" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "cernatschool.org"

Notes: n/a


Virtual Organisation: FERMILAB

Filename: /etc/grid-security/vomsdir/fermilab/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/grid-security/vomsdir/fermilab/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/fermilab-voms1.fnal.gov

"fermilab" "voms1.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "fermilab"

Filename: /etc/vomses/fermilab-voms2.fnal.gov

"fermilab" "voms2.fnal.gov" "15001" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "fermilab"

Notes: n/a


Virtual Organisation: VO.MOEDAL.ORG

Filename: /etc/grid-security/vomsdir/vo.moedal.org/lcg-voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/grid-security/vomsdir/vo.moedal.org/voms2.cern.ch.lsc

/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch
/DC=ch/DC=cern/CN=CERN Grid Certification Authority

Filename: /etc/vomses/vo.moedal.org-lcg-voms2.cern.ch

"vo.moedal.org" "lcg-voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=lcg-voms2.cern.ch" "vo.moedal.org"

Filename: /etc/vomses/vo.moedal.org-voms2.cern.ch

"vo.moedal.org" "voms2.cern.ch" "15017" "/DC=ch/DC=cern/OU=computers/CN=voms2.cern.ch" "vo.moedal.org"

Notes: n/a


Virtual Organisation: LZ

Notes: n/a


Virtual Organisation: SKATELESCOPE.EU

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/skatelescope.eu/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/skatelescope.eu-voms.gridpp.ac.uk

"skatelescope.eu" "voms.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms02.gridpp.ac.uk

"skatelescope.eu" "voms02.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "skatelescope.eu"

Filename: /etc/vomses/skatelescope.eu-voms03.gridpp.ac.uk

"skatelescope.eu" "voms03.gridpp.ac.uk" "15512" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "skatelescope.eu"

Notes: n/a


Virtual Organisation: DUNE

Filename: /etc/grid-security/vomsdir/dune/voms1.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/grid-security/vomsdir/dune/voms2.fnal.gov.lsc

/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov
/C=US/O=Internet2/OU=InCommon/CN=InCommon IGTF Server CA

Filename: /etc/vomses/dune-voms1.fnal.gov

"dune" "voms1.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms1.fnal.gov" "dune"

Filename: /etc/vomses/dune-voms2.fnal.gov

"dune" "voms2.fnal.gov" "15042" "/DC=org/DC=incommon/C=US/ST=Illinois/O=Fermi Research Alliance/OU=Fermilab/CN=voms2.fnal.gov" "dune"

Notes: n/a


Virtual Organisation: SOLIDEXPERIMENT.ORG

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms02.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/grid-security/vomsdir/solidexperiment.org/voms03.gridpp.ac.uk.lsc

/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk
/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B

Filename: /etc/vomses/solidexperiment.org-voms.gridpp.ac.uk

"solidexperiment.org" "voms.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms02.gridpp.ac.uk

"solidexperiment.org" "voms02.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=voms02.gridpp.ac.uk" "solidexperiment.org"

Filename: /etc/vomses/solidexperiment.org-voms03.gridpp.ac.uk

"solidexperiment.org" "voms03.gridpp.ac.uk" "15513" "/C=UK/O=eScience/OU=Imperial/L=Physics/CN=voms03.gridpp.ac.uk" "solidexperiment.org"

Notes: n/a
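
Although the vomses and .lsc entries above are normally deployed from configuration management, the common pattern is worth spelling out: each vomses file contains a single line of five quoted fields (VO alias, VOMS server host, port, server certificate DN, VO name), and the matching .lsc file lists the server certificate DN followed by the issuing CA DN. The following is a minimal sketch, assuming a Python 3 environment and write access to the target directories; the function name and calling convention are illustrative only, and the example values are copied from the solidexperiment.org entries above.

#!/usr/bin/env python3
"""Sketch: generate matching vomses and .lsc entries for one VOMS server."""
from pathlib import Path

def write_voms_entry(vo, host, port, server_dn, ca_dn,
                     vomses_dir="/etc/vomses",
                     vomsdir="/etc/grid-security/vomsdir"):
    # vomses line: "<alias>" "<host>" "<port>" "<server DN>" "<VO name>"
    vomses_line = f'"{vo}" "{host}" "{port}" "{server_dn}" "{vo}"\n'
    Path(vomses_dir).mkdir(parents=True, exist_ok=True)
    (Path(vomses_dir) / f"{vo}-{host}").write_text(vomses_line)

    # .lsc file: server certificate DN on the first line, issuing CA DN on the second
    lsc_path = Path(vomsdir) / vo / f"{host}.lsc"
    lsc_path.parent.mkdir(parents=True, exist_ok=True)
    lsc_path.write_text(f"{server_dn}\n{ca_dn}\n")

if __name__ == "__main__":
    # Example values taken from the solidexperiment.org entries listed above.
    write_voms_entry(
        vo="solidexperiment.org",
        host="voms.gridpp.ac.uk",
        port=15513,
        server_dn="/C=UK/O=eScience/OU=Manchester/L=HEP/CN=voms.gridpp.ac.uk",
        ca_dn="/C=UK/O=eScienceCA/OU=Authority/CN=UK e-Science CA 2B",
    )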




VO Resource Requirements

Please Note

Please do not change the table below as it is automatically updated from the EGI Operations Portal. Any changes you make will be lost.


VO Ram/Core(MB) MaxCPU(min) MaxWall(min) Scratch(MB) Other
alice 2000 1320 1500 10000
atlas 2048 5760 5760 20000 Additional runtime requirements:
  • at least 4 GB of virtual memory for each job slot

Software installation common items:

  • the full compiler suite (C/C++ and Fortran) should be installed on the WNs, including all the compat-gcc-32* and the SL_libg2c.a_change packages on SL4-like nodes;
  • the recommended version of the compilers is 3.4.6;
  • the f2c and libgfortran libraries (in both i386 and x86_64 versions on x86_64 systems) are also required to run the software;
  • other libraries required are:
libpopt.so.0
libblas.so

Software installation setup (cvmfs sites):

Software installation requirements (non-cvmfs sites):

  • an experiment software area (shared filesystem) with at least 500 GB free and reserved for ATLAS.
biomed 100 1 1 100 For sites providing an SE, the minimum required storage space is 1 TB.
calice 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/calice.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
cms 2000 2880 4320 20000 Note: CMS usually sends 8-core pilots; the 'Multi Core' values refer to those. Single-core pilots are discouraged.

Jobs require an address space larger than the memory size specified above. Sites should allow processes at least 6 GB more virtual address space per core than the memory value, to accommodate the large number of shared libraries used by jobs. (For a typical 8-core pilot that translates into a VSIZE limit of at least 64 GB; a worked example follows after this table.)

Cloud resources should provision 8-core VMs to match standard 8-core pilots.

Input I/O requirement is an average 2.5 MB/s per thread from MSS.

All jobs need to have outbound connectivity.

Sites must not use pool accounts for the FQAN cms:/cms/Role=lcgadmin. For any other CMS job, sites need to use pool accounts so that at any time every grid credential is mapped to an independent local account.


National VOMS groups: In CMS, national VOMS groups, e.g. /cms/becms or /cms/dcms, are used. Proxies carrying such groups must be "supported" at all sites in the following way:

  • they should be treated like /cms (the base group) if no special treatment is wanted by the site
  • proxies with such national groups must be able to write to /store/user/temp (the PFN associated with this LFN)
dteam 0 None None 0
enmr.eu 8000 2880 4320 1000
  1. For COVID-19 related jobs, slots with 8 GB/core are required.
  2. The WeNMR software area must be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS and https://wiki.egi.eu/wiki/PROC22. Please do not forget to define on all WNs the environment variable VO_ENMR_EU_SW_DIR=/cvmfs/wenmr.egi.eu, as pointed out in the above documents.
  3. The line
"/enmr.eu/*"::::
has to be added to the group.conf file before configuring the grid services via YAIM. On the CREAM-CE this results in the lines
"/enmr.eu/*/Role=NULL/Capability=NULL" .enmr
"/enmr.eu/*" .enmr
in both /etc/grid-security/grid-mapfile and /etc/grid-security/voms-grid-mapfile, and in the lines
"/enmr.eu/*/Role=NULL/Capability=NULL" enmr
"/enmr.eu/*" enmr
in /etc/grid-security/groupmapfile. Enabling every VO group that is added is required for implementing per-application accounting.

esr 2048 2100 0 0 Many applications only need part of the following. Java/Perl/Python/C/C++/FORTRAN77,-90,-95; IDL and MATLAB runtime; Scilab or Octave. Needs MPI for some applications.

Some applications require access to job output during execution, some even interaction via X11. 1 GB RAM; some applications need 3 GB RAM. Outbound connectivity from the WN to databases is needed. A shared file system is needed for MPI applications, with about 10 GB of space. Some applications need about 1000 simultaneously open files. Depending on the application, output file sizes range from a few MB to 5 GB, for a total of several hundred thousand files. No permanent storage is needed, but transient and durable storage are. Low-latency scheduling for short jobs is needed.

geant4 1000 650 850 300 Software is distributed via CernVM-FS

(http://cernvm.cern.ch/portal/filesystem); the configuration should include the geant4.cern.ch and dependency (sft.cern.ch, grid.cern.ch) areas.

CernVM-FS needs to be accessible on the WNs. The CernVM-FS cache area needed is about 5 GB.

gridpp 1000 1000 0 0
icecube 4000 2880 2880 40000 CVMFS is used for the software distribution via:

/cvmfs/icecube.opensciencegrid.org

ilc 2048 3600 5400 15000 CVMFS is used for the software distribution via:
/cvmfs/ilc.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
lhcb 0 0 0 20000 Further recommendations from LHCb for sites:

The amount of memory in the field "Max used physical non-swap X86_64 memory size" of the resources section is understood to be the virtual memory required per single process of an LHCb payload. Usually LHCb payloads consist of one "worker process", consuming the majority of the memory, and several wrapper processes. The total amount of virtual memory for all wrapper processes accounts for 1 GB, which needs to be added as a requirement to the field "Max used physical non-swap X86_64 memory size" in case the virtual memory of the whole process tree is monitored.

The amount of space in the field "Max size of scratch space used by jobs" shall be interpreted as 50% each for downloaded input files and produced output files.

Sites should have the CentOS 7 or "CERN CentOS 7" operating system, or later versions, installed on their worker nodes. Sites are requested to provide support for Singularity containers and user namespaces. The latter can be checked by ensuring that /proc/sys/user/max_user_namespaces contains a large number (a simple check is sketched after this table).

The underlying OS should provide the libraries, binaries, and scripts required by the current HEP_OSlibs RPM meta package.

The shared software area shall be provided via CVMFS. LHCb uses the mount points /cvmfs/lhcb.cern.ch, /cvmfs/lhcb-condb.cern.ch, /cvmfs/grid.cern.ch and /cvmfs/cernvm-prod.cern.ch on the worker nodes.

Provisioning of a reasonable number of slots per disk server, proportional to the maximum number of concurrent jobs at the site.

Non-T1 sites providing CVMFS, direct HTCondor-CE, ARC, or CREAM submission, and the requested amount of local scratch space will be considered as candidates for additional workloads (e.g. data reprocessing campaigns).

Sites not having an SRM installation must provide:

magic 1024 5000 0 0 Fortran77 and other compilers. See details in annex of MoU (documentation section).
ops 0 0 0 0
pheno 0 0 0 0
snoplus.snolab.ca 2000 1440 2160 20000 g++

gcc python-devel uuid-devel zlib-devel

SNO+ software area should be mounted on the WNs through CVMFS as described in https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS.

t2k.org 1500 600 600 1000 t2k.org software should be mounted on WNs via CVMFS as defined at https://www.gridpp.ac.uk/wiki/RALnonLHCCVMFS
virgo 0 0 0 0
vo.complex-systems.eu 0 0 0 0
vo.cta.in2p3.fr 0 0 2000 0
vo.northgrid.ac.uk 0 0 0 0
vo.scotgrid.ac.uk 0 0 0 0
vo.southgrid.ac.uk 0 0 0 0
zeus 2048 3600 5400 5000 CVMFS is used for the software distribution via:
/cvmfs/zeus.desy.de

For setup instructions refer to:

http://grid.desy.de/cvmfs
mice 0 0 0 0
vo.landslides.mossaic.org 0 0 0 0
ipv6.hepix.org 0 0 0 0
na62.vo.gridpp.ac.uk 2048 500 720 2048 VO_NA62_VO_GRIDPP_AC_UK_SW_DIR=/cvmfs/na62.cern.ch

Need also access to /cvmfs/geant4.cern.ch and /cvmfs/sft.cern.ch

comet.j-parc.jp 2048 1440 2880 40960
epic.vo.gridpp.ac.uk 0 0 0 0
lsst 0 0 0 0 VO name must be "lsst" as it is an existing VO in OSG!

cf. the VOMS URL.

hyperk.org 0 1440 1440 10000
cernatschool.org 0 0 0 0
fermilab 0 0 0 0
vo.moedal.org 0 0 0 0
lz 0 0 0 0
skatelescope.eu 0 None None 0
dune 0 2880 2880 10000
solidexperiment.org 0 0 0 0 will need to set up CVMFS.
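
The CMS address-space requirement quoted in the table above lends itself to a short worked illustration. The sketch below is purely illustrative; the 2 GB/core figure is the cms 'Ram/Core' value from the table, and the 6 GB/core and 8-core values are those quoted in the CMS row.

# Worked example for the CMS virtual-address-space (VSIZE) limit quoted above.
mem_per_core_gb = 2          # cms 'Ram/Core' value from the table (2000 MB, ~2 GB)
extra_vsize_per_core_gb = 6  # extra virtual address space requested per core
cores_per_pilot = 8          # CMS standard multi-core pilot

vsize_limit_gb = cores_per_pilot * (mem_per_core_gb + extra_vsize_per_core_gb)
print(f"Minimum VSIZE limit for an {cores_per_pilot}-core pilot: {vsize_limit_gb} GB")
# prints: Minimum VSIZE limit for an 8-core pilot: 64 GB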
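
The LHCb row in the table above asks for user-namespace support and a set of CVMFS mount points on the worker nodes. The script below is a minimal sketch of a WN sanity check; it is not part of any LHCb requirement, and the "large number" threshold for max_user_namespaces is an assumption (here simply: greater than zero).

#!/usr/bin/env python3
"""Sketch: check the LHCb CVMFS mount points and user-namespace support on a WN."""
import os

# CVMFS repositories LHCb expects on the worker nodes (from the table above)
CVMFS_MOUNTS = [
    "/cvmfs/lhcb.cern.ch",
    "/cvmfs/lhcb-condb.cern.ch",
    "/cvmfs/grid.cern.ch",
    "/cvmfs/cernvm-prod.cern.ch",
]

def check_cvmfs():
    for path in CVMFS_MOUNTS:
        # Listing the directory triggers the autofs mount if CVMFS is configured
        try:
            ok = bool(os.listdir(path))
        except OSError:
            ok = False
        print(f"{path}: {'ok' if ok else 'MISSING'}")

def check_user_namespaces():
    try:
        with open("/proc/sys/user/max_user_namespaces") as f:
            value = int(f.read().strip())
    except (OSError, ValueError):
        print("user namespaces: /proc/sys/user/max_user_namespaces not readable")
        return
    # Assumed threshold: anything above zero; sites may prefer a stricter check.
    print(f"max_user_namespaces = {value} ({'ok' if value > 0 else 'too low'})")

if __name__ == "__main__":
    check_cvmfs()
    check_user_namespaces()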


VO enablement

The VOs that are enabled at each site are listed in a VO table.

This page is a Key Document, and is the responsibility of Gerard Hand. It was last reviewed on 2022-04-01, when it was considered to be 0% complete. It has never been judged to be accurate.

DRAFT