[Hps-analysis] high statistics tritrig-wab-beam
Yale, Bradley T
btu29 at wildcats.unh.edu
Thu Jan 18 13:34:19 EST 2018
tritrig-wab-beam DSTs with 10-to-1 readout should be around 4 MB per file, from what I've seen.
Trident tuples are around 0.5 MB each for tritrig-beam, and up to 2 MB for the highest-acceptance A' samples if you include a lot of extra vertex patch variables.
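A rough sketch of what these per-file sizes imply in total, assuming one DST and one trident tuple per recon file (an assumption, not stated in the thread) and the ~10,000-file 2015-equivalent sample Takashi quotes below:

    # Rough DST/tuple totals from the per-file sizes above and the file
    # count quoted below; one DST and one tuple per recon file is assumed.
    N_RECON_FILES = 10_000          # 2015-data-equivalent sample (Takashi, below)
    DST_MB_PER_FILE = 4.0           # tritrig-wab-beam DST, 10-to-1 readout (above)
    TUPLE_MB_PER_FILE = (0.5, 2.0)  # trident tuple, tritrig-beam up to high-acceptance A'

    dst_total_gb = N_RECON_FILES * DST_MB_PER_FILE / 1e3
    tuple_total_gb = [N_RECON_FILES * s / 1e3 for s in TUPLE_MB_PER_FILE]

    print(f"DSTs:   ~{dst_total_gb:.0f} GB")                                # ~40 GB
    print(f"Tuples: ~{tuple_total_gb[0]:.0f}-{tuple_total_gb[1]:.0f} GB")   # ~5-20 GB

So the DST and tuple output is small compared to the recon and slcio numbers discussed below.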
________________________________
From: Hps-analysis <hps-analysis-bounces at jlab.org> on behalf of Maruyama, Takashi <tvm at slac.stanford.edu>
Sent: Thursday, January 18, 2018 12:22:59 PM
To: 'Nathan Baltzell'
Cc: hps-analysis at jlab.org
Subject: Re: [Hps-analysis] high statistics tritrig-wab-beam
Hi Nathan,
Since we are in the middle of testing the tuple/DST makers, I don't know the tuple/DST file sizes; Bradley might know. The recon files are 400 MB each, and we need 10,000 recon files to be equivalent to the 2015 data statistics, so the total disk space requirement is 4 TB.
The tuple maker can generate three tuples: FEE, Moller, and trident. Since tritrig-wab-beam is processed with the pair1 trigger, I would think there are not many FEE or Moller events in the tritrig-wab-beam recon files. If we want FEE and Moller samples, a separate readout/recon step must be run on wab-beam.slcio. Do we need FEE and Moller? If we don't need them, I will delete the wab-beam.slcio files, as each wab-beam.slcio file is 450 MB and we need 100,000 files (45 TB).
Takashi
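A minimal sketch of the disk-space arithmetic above, using only the per-file sizes and file counts quoted in this thread; it reproduces the 4 TB and 45 TB figures:

    # Recon and intermediate wab-beam.slcio disk totals, from the numbers
    # quoted in this thread (400 MB/recon file, 450 MB/slcio file).
    RECON_MB_PER_FILE = 400        # recon file size
    N_RECON_FILES = 10_000         # 2015-data-equivalent statistics
    SLCIO_MB_PER_FILE = 450        # intermediate wab-beam.slcio file size
    N_SLCIO_FILES = 100_000        # ~100,000 slcio files for 2015 equivalence

    recon_tb = RECON_MB_PER_FILE * N_RECON_FILES / 1e6    # -> 4 TB
    slcio_tb = SLCIO_MB_PER_FILE * N_SLCIO_FILES / 1e6    # -> 45 TB
    slcio_per_10k_tb = SLCIO_MB_PER_FILE * 10_000 / 1e6   # -> ~4.5 TB, i.e. the
                                                          #    "~5 TB per 10,000 files" below

    print(f"recon total:                  {recon_tb:.1f} TB")
    print(f"wab-beam.slcio total:         {slcio_tb:.1f} TB")
    print(f"wab-beam.slcio per 10k files: {slcio_per_10k_tb:.1f} TB")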
-----Original Message-----
From: Nathan Baltzell [mailto:baltzell at jlab.org]
Sent: Tuesday, January 16, 2018 4:08 PM
To: Maruyama, Takashi
Cc: hps-analysis at jlab.org
Subject: Re: [Hps-analysis] high statistics tritrig-wab-beam
What’s the estimate on disk space requirements for the different bits in #1?
-nathan
> On Jan 16, 2018, at 7:03 PM, Maruyama, Takashi <tvm at slac.stanford.edu> wrote:
>
> High statistics tritrig-wab-beam production is in progress. About 15% of the 2015-data-equivalent statistics has been generated, but the production is paused due to lack of disk space. As soon as disk space becomes available, the production will resume. There are a few issues.
> 1) What files do we need to transfer to JLab? Only recon files, only tuple files, or DSTs as well?
> 2) We need a contact person at JLab who finds the disk space and does the transfer.
> 3) Since the intermediate wab-beam.slcio files are large, requiring 5 TB for 10,000 files (~100,000 files to be 2015-data equivalent), we need to delete these files once the recon files are made. But wab-beam could be useful for generating, for example, wab-beam-tri. How long do we need to keep wab-beam? I would delete the files once the recons, tuples, and DSTs are made.
>
> Takashi
>
_______________________________________________
Hps-analysis mailing list
Hps-analysis at jlab.org
https://mailman.jlab.org/mailman/listinfo/hps-analysis