[Hps-analysis] high statistics tritrig-wab-beam
Nathan Baltzell
baltzell at jlab.org
Thu Jan 18 18:30:57 EST 2018
OK, let's say 5 TB. That's probably too much to comfortably transfer to /work or /volatile without some significant cleanup there first, so the first choice would be straight to /cache.
Does SLAC still have a Globus endpoint? Has anyone used Globus to write to /cache at JLab?
-Nathan
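
(For reference: a rough sketch of what a Globus-driven transfer into /cache could look like, using the Globus Python SDK. The endpoint UUIDs, paths, and token handling below are placeholders, not real SLAC/JLab values; whether suitable endpoints exist on both ends is exactly the open question above.)

    # Hypothetical sketch using the Globus Python SDK (globus_sdk).
    # Endpoint UUIDs, paths, and the token are placeholders, not real SLAC/JLab values.
    import globus_sdk

    SLAC_ENDPOINT = "00000000-0000-0000-0000-000000000000"  # placeholder UUID
    JLAB_ENDPOINT = "11111111-1111-1111-1111-111111111111"  # placeholder UUID

    # Assumes a Globus transfer token was obtained separately (e.g. via a native-app login flow).
    tc = globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer("TRANSFER_TOKEN_HERE"))

    # Recursive, checksum-verified transfer of the recon directory.
    tdata = globus_sdk.TransferData(
        tc, SLAC_ENDPOINT, JLAB_ENDPOINT,
        label="tritrig-wab-beam recon to /cache",
        sync_level="checksum")
    tdata.add_item("/hps/mc/tritrig-wab-beam/recon/",              # placeholder source path
                   "/cache/hallb/hps/mc/tritrig-wab-beam/recon/",  # placeholder /cache path
                   recursive=True)

    task = tc.submit_transfer(tdata)
    print("submitted Globus task", task["task_id"])
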
> On Jan 18, 2018, at 12:22 PM, Maruyama, Takashi <tvm at slac.stanford.edu> wrote:
>
> Hi Nathan,
>
> Since we are in the middle of testing the tuple/DST makers, I don't know the tuple/DST file sizes; Bradley might know. Each recon file is 400 MB, and we need 10,000 recon files to be equivalent to the 2015 data statistics, so the total disk space requirement is 4 TB.
>
> The tuple maker can generate three tuples: FEE, Moller, and trident. Since tritrig-wab-beam is processed with the pair1 trigger, I would think there are not many FEE or Moller events in the tritrig-wab-beam recon files. If we want FEE and Moller samples, a separate readout/recon step must be run on wab-beam.slcio. Do we need FEE and Moller? If not, I will delete the wab-beam.slcio files, since each is 450 MB and we need 100,000 files, i.e. 45 TB.
>
> Takashi
>
> -----Original Message-----
> From: Nathan Baltzell [mailto:baltzell at jlab.org]
> Sent: Tuesday, January 16, 2018 4:08 PM
> To: Maruyama, Takashi
> Cc: hps-analysis at jlab.org
> Subject: Re: [Hps-analysis] high statistics tritrig-wab-beam
>
> What’s the estimate on disk space requirements for the different bits in #1?
>
> -nathan
>
>
>
>> On Jan 16, 2018, at 7:03 PM, Maruyama, Takashi <tvm at slac.stanford.edu> wrote:
>>
>> High-statistics tritrig-wab-beam production is in progress. About 15% of the 2015-data-equivalent statistics has been generated, but the production is paused due to lack of disk space; as soon as disk space becomes available, it will resume. There are a couple of issues. 1) Which files do we need to transfer to JLab: only recon files, only tuple files, or DSTs as well? 2) We need a contact person at JLab who finds the disk space and does the transfer. 3) Since the intermediate wab-beam.slcio files are large, requiring 5 TB for 10,000 files (~100,000 files to be 2015-data equivalent), we need to delete them once the recon files are made. But wab-beam could be useful for generating, for example, wab-beam-tri. How long do we need to keep wab-beam? I would delete the files once the recons, tuples, and DSTs are made.
>>
>> Takashi
>>
>> _______________________________________________
>> Hps-analysis mailing list
>> Hps-analysis at jlab.org
>> https://mailman.jlab.org/mailman/listinfo/hps-analysis
>
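
As a back-of-the-envelope check of the numbers quoted above (400 MB per recon file, 450 MB per wab-beam.slcio, and roughly 10,000 recon / 100,000 slcio files for 2015-equivalent statistics), here is a small Python sketch of the disk budget:

    # Disk-budget arithmetic using the per-file sizes quoted in the thread above.
    MB, TB = 1e6, 1e12

    recon_files = 10_000        # ~2015-data-equivalent statistics
    recon_size  = 400 * MB      # per recon file
    slcio_files = 100_000       # intermediate wab-beam.slcio files
    slcio_size  = 450 * MB      # per slcio file

    print(f"recon total:    {recon_files * recon_size / TB:.1f} TB")  # -> 4.0 TB
    print(f"wab-beam.slcio: {slcio_files * slcio_size / TB:.1f} TB")  # -> 45.0 TB
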