GridPP5 Tier2 plans

Other links

Sites batch system status

This page has been set up to collect information from GridPP sites regarding their batch systems in February 2014. The information will help with wider considerations and strategy. The table asks for the following:

  1. Site name
  2. Batch/CE system (the main batch system and CE you intend to use in GridPP5; this might be one you are testing as a replacement for, say, Torque/CREAM. See the submit-file sketch after this list.)
  3. Shared, non-CE? Yes/No (Is the batch system shared with users who don't access it through the grid CE?)
  4. Shared filesystem? (Do users rely on a shared filesystem, e.g. Lustre, that could not be replaced with local filesystems on the worker nodes? If so, which one?)
  5. Non-LHC, non GridPP DIRAC VOs? (Do you support VOs, e.g. from EGI, that are neither LHC experiments nor users of the GridPP DIRAC service? Please list the top 3.)
  6. Non-LHC storage? (Do you provide storage to non-LHC projects? Please list the top 3.)
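
For sites reporting HTCondor as their batch system (as UKI-NORTHGRID-MAN-HEP does in the table below), the following is a minimal sketch of an HTCondor submit description. The file name test.sub and the /bin/hostname payload are illustrative assumptions only, not anything mandated by GridPP.

  # test.sub: minimal HTCondor submit description for a single test job.
  # Runs /bin/hostname on a worker node and returns the output files.
  executable     = /bin/hostname
  output         = test.out
  error          = test.err
  log            = test.log
  request_cpus   = 1
  request_memory = 100MB
  queue

A local user would submit this with condor_submit test.sub, bypassing the grid CE entirely; that is the usage pattern question 3 ("Shared, non-CE?") asks about, whereas grid jobs would reach the same pool through a CE such as ARC.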


Site                     | Batch/CE system | Shared, non-CE? | Shared filesystem? | Non-LHC, non GridPP DIRAC VOs? | Non-LHC storage? | Notes
UKI-LT2-Brunel           | .               | .               | .                  | .                              | .                | .
UKI-LT2-IC-HEP           | .               | .               | .                  | .                              | .                | .
UKI-LT2-QMUL             | .               | .               | .                  | .                              | .                | .
UKI-LT2-RHUL             | .               | .               | .                  | .                              | .                | .
UKI-NORTHGRID-LANCS-HEP  | .               | .               | .                  | .                              | .                | .
UKI-NORTHGRID-LIV-HEP    | .               | .               | .                  | .                              | .                | .
UKI-NORTHGRID-MAN-HEP    | HTCondor/ARC    | No              | No                 | Biomed, ILC, Icecube           | .                | .
UKI-NORTHGRID-SHEF-HEP   | .               | .               | .                  | .                              | .                | .
UKI-SCOTGRID-DURHAM      | .               | .               | .                  | .                              | .                | .
UKI-SCOTGRID-ECDF        | .               | .               | .                  | .                              | .                | .
UKI-SCOTGRID-GLASGOW     | .               | .               | .                  | .                              | .                | .
UKI-SOUTHGRID-BHAM-HEP   | .               | .               | .                  | .                              | .                | .
UKI-SOUTHGRID-BRIS       | .               | .               | .                  | .                              | .                | .
UKI-SOUTHGRID-CAM-HEP    | .               | .               | .                  | .                              | .                | .
UKI-SOUTHGRID-OX-HEP     | .               | .               | .                  | .                              | .                | .
UKI-SOUTHGRID-RALPP      | .               | .               | .                  | .                              | .                | .
UKI-SOUTHGRID-SUSX       | .               | .               | .                  | .                              | .                | .