DIRAC Data Handling within a Job
Glossary
VO: Virtual Organization.
LFN: Logical File Name: How DIRAC locates files. Must start with the VO name.
Dealing with Input data
InputSandbox
The InputSandbox can contain local files that you would like to ship with the job (try to stay below 10 MB for this) and LFNs of data that you want to process within the job:
JDL Example: InputSandbox = {"diractest.sh", "LFN:/lz/user/dirac01.test/dirac01.testfile.txt"};
API example: job.setInputSandbox(["diractest.sh", "LFN:/lz/user/dirac01.test/dirac01.testfile.txt"])
LFNs must be indicated by the LFN: prefix.
DIRAC will automatically stage the data to the worker node for you.
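Putting the pieces together, a minimal JDL using an InputSandbox might look like the following sketch (the executable name and LFN are the illustrative ones from above; the StdOutput/StdError/OutputSandbox lines are standard JDL attributes added here for completeness):

Executable = "diractest.sh";
InputSandbox = {"diractest.sh", "LFN:/lz/user/dirac01.test/dirac01.testfile.txt"};
StdOutput = "std.out";
StdError = "std.err";
OutputSandbox = {"std.out", "std.err"};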
InputData
InputData must be specified as an LFN (or a list of LFNs), without any prefix.
JDL Example: InputData = {"/lz/user/dirac01.test/dirac01.testfile.txt"};
API Example: job.setInputData(['/lz/user/dirac01.test/dirac01.testfile.txt'])
If you specify InputData (as opposed to InputSandbox), your jobs will only run at sites where the data is present. If you give a list of LFNs and no single site hosts *all* of them, DIRAC will return an error.
DIRAC will automatically stage the data to the worker node for you.
Dealing with Output data
Note: Please ensure that you always specify the storage element you want your data to be written to. This must be a storage element you have write access to. We strongly recommend uploading a file to your chosen storage element by hand before you submit a job, to check that the permissions on the storage element are set correctly.
Using JDLs
- Specify OutputSE and OutputData
Example: Your job produces files ending in txt and dat and you want all of these files to go to the storage element at Imperial:
OutputSE = "UKI-LT2-IC-HEP-disk"; OutputData = { "*.txt", "*.dat" };
By default these files are uploaded to the user area (/voname/user/initial/your.dirac.username/), into a directory that references the job number (here '700'):
In my example this is: /gridpp/user/d/daniela.bauer/0/700/
FC:/> ls /gridpp/user/d/daniela.bauer/0/700/
testfile.1591886794.2.LCG.UKI-LT2-IC-HEP.uk.txt
testfile.1591886794.LCG.UKI-LT2-IC-HEP.uk.dat
testfile.1591886794.LCG.UKI-LT2-IC-HEP.uk.txt
- Specify OutputSE, OutputData and OutputPath (recommended)
Example:
OutputSE = "UKI-LT2-IC-HEP-disk"; OutputPath = "/special_path"; OutputData = { "*.txt", "*.dat" };
Now the files will still be in the user area, but located in the directory you specified:
FC:/> ls /gridpp/user/d/daniela.bauer/special_path/
testfile.1591886807.2.LCG.UKI-LT2-IC-HEP.uk.txt
testfile.1591886807.LCG.UKI-LT2-IC-HEP.uk.dat
testfile.1591886807.LCG.UKI-LT2-IC-HEP.uk.txt
- Specify full LFN (no wildcards possible)
Example: Your job produces three output files. The first two need to go into one directory, the last one into a different one.
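A sketch of what this could look like in a JDL (the LFN paths are illustrative; remember that no wildcards are possible when specifying full LFNs):

OutputSE = "UKI-LT2-IC-HEP-disk";
OutputData = {
  "LFN:/gridpp/dbauer/testlfnapi/txt/testfile.lfn1.txt",
  "LFN:/gridpp/dbauer/testlfnapi/txt/testfile.lfn2.txt",
  "LFN:/gridpp/dbauer/testlfnapi/dat/testfile.lfn3.dat"
};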
- Do everything by hand inside your job. In this case you do not need to specify anything in your JDL, but please make sure you include at least one retry for your file upload, to deal with intermittent problems:
Example (bash):
# If you do this, please always check if your upload has worked, and if not,
# try again in a couple of minutes, as failures are often transient.
dirac-dms-add-file /gridpp/daniela.bauer/outputexamples/txt/testfile1.txt testfile1.txt UKI-LT2-IC-HEP-disk
if [ $? -ne 0 ]; then
  # wait 5 min and try again
  sleep 300
  dirac-dms-add-file /gridpp/daniela.bauer/outputexamples/txt/testfile1.txt testfile1.txt UKI-LT2-IC-HEP-disk
  if [ $? -ne 0 ]; then
    echo "Upload failed even on second try."
  fi
fi
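If you prefer Python for your wrapper script, the same retry pattern can be sketched like this. Note that `upload_with_retry` and `dirac_upload` are not part of DIRAC, just illustrative helpers; the dirac-dms-add-file arguments are the ones from the bash example above:

```python
import subprocess
import time


def upload_with_retry(run_cmd, retries=1, wait=300):
    """Run an upload command, retrying on failure.

    run_cmd: a zero-argument callable returning the command's exit
             status (0 on success), e.g. wrapping dirac-dms-add-file.
    retries: how many additional attempts to make after a failure.
    wait:    seconds to sleep between attempts.
    """
    for attempt in range(retries + 1):
        if run_cmd() == 0:
            return True
        if attempt < retries:
            time.sleep(wait)
    return False


def dirac_upload():
    # Illustrative wrapper around the dirac-dms-add-file call
    # from the bash example; adjust the paths for your own VO.
    return subprocess.call([
        "dirac-dms-add-file",
        "/gridpp/daniela.bauer/outputexamples/txt/testfile1.txt",
        "testfile1.txt",
        "UKI-LT2-IC-HEP-disk",
    ])

# success = upload_with_retry(dirac_upload)  # retries once after 5 minutes
```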
Using the Python API
This assumes you have a Job object:
from DIRAC.Interfaces.API.Job import Job
job = Job()
- Specify output storage element and output data only (wildcards possible). As above, output will appear under /voname/user/initial/your.dirac.username/, in a subdirectory referencing the job number.
job.setOutputData(['*.txt', '*.dat'], outputSE='UKI-LT2-IC-HEP-disk')
FC:/> ls /gridpp/user/d/daniela.bauer/0/718
testfile.1592236946.2.LCG.UKI-LT2-IC-HEP.uk.txt
testfile.1592236946.LCG.UKI-LT2-IC-HEP.uk.dat
testfile.1592236946.LCG.UKI-LT2-IC-HEP.uk.txt
- Specify output storage element, directory and output data.
job.setOutputData(['*.txt', '*.dat'], outputSE='UKI-LT2-IC-HEP-disk', outputPath='/api/specialpath')
Your data will appear in
FC:/> ls /gridpp/user/d/daniela.bauer/api/specialpath/
testfile.1591976938.2.LCG.UKI-LT2-Brunel.uk.txt
testfile.1591976938.LCG.UKI-LT2-Brunel.uk.dat
testfile.1591976938.LCG.UKI-LT2-Brunel.uk.txt
- Specify the full LFN. No wildcards possible.
job.setOutputData(
    ["LFN:/gridpp/dbauer/testlfnapi/txt/testfile.lfn1.txt",
     "LFN:/gridpp/dbauer/testlfnapi/txt/testfile.lfn2.txt",
     "LFN:/gridpp/dbauer/testlfnapi/dat/testfile.lfn3.dat"],
    outputSE='UKI-LT2-IC-HEP-disk')
Your data will appear in
FC:/> ls /gridpp/dbauer/testlfnapi/txt/
testfile.lfn1.txt
testfile.lfn2.txt
FC:/> ls /gridpp/dbauer/testlfnapi/dat
testfile.lfn3.dat
- Do everything by hand. Oh dear. You can either keep doing it in your bash wrapper (if you have one) or, if you want to write this in Python (note: I haven't tested this), you probably want to start by looking at putAndRegister in the DataManagementSystem. It might end up looking something like this:
from DIRAC.DataManagementSystem.Client.DataManager import DataManager

dm = DataManager()
lfn = "/gridpp/user/d/daniela.bauer/moretests/testfile2.txt"
res = dm.putAndRegister(lfn, "testfile2.txt", "UKI-LT2-IC-HEP-disk", overwrite=True)
if not res['OK']:
    print("Oh dear, failed to upload %s" % lfn)
else:
    print("Upload successful.")