[Hps-analysis] Requests for data cooking

Graf, Norman A. ngraf at slac.stanford.edu
Mon Jul 22 16:57:56 EDT 2019


Hello Tongtong,


I'm seeing output files showing up in recon and rootTree, but nothing in log. Where are the log files being written to?


Norman


________________________________
From: Hps-analysis <hps-analysis-bounces at jlab.org> on behalf of Tongtong Cao <tongtongcao1 at gmail.com>
Sent: Monday, July 22, 2019 8:21 AM
To: hps-software <hps-software at slac.stanford.edu>; hps-analysis at jlab.org <hps-analysis at jlab.org>
Subject: Re: [Hps-analysis] Requests for data cooking

Dear all,

Yesterday, a large number of runs became available.
I picked up most of the runs from 9908-9929 for cooking.
The jobs have been submitted and are running.

The jar is up to date.
-d: HPS-PhysicsRun2019-v1-4pt5
-x: PhysicsRun2019_testRecon.lcsim
-R: ${run}

Please let me know if you need additional runs or have special setup requirements.

Best regards,
Tongtong

On Jul 17, 2019, at 11:09 AM, Tongtong Cao <tongtongcao1 at gmail.com> wrote:

Dear Colleagues,

According to the run list in Run Spreadsheet 2019, I categorized the runs collected so far as “Ecal Pedestal”, “SVT OFF”, and “SVT ON”, after dropping junk runs.
A table is attached.

There are two steps for cooking:
  Step 1: Decoding and reconstruction using the up-to-date “SNAPSHOT-bin.jar”
      Parameter setup in the current jsub.xml:
           1) -d (detector): HPS-PhysicsRun2019-v1-4pt5
           2) -x (steering file): PhysicsRun2019_testRecon.lcsim for “SVT ON” runs, and PhysicsRun2019_NoSVT.lcsim for “SVT OFF” runs
           3) -R (run number): ${run} (the run number of the run being processed)

  Step 2: Conversion from the LCIO file into a ROOT file using Make_Root.cc or Make_Root_NoSVT.cc, for the convenience of C++ users (a minimal sketch of this conversion pattern appears below)
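
For C++ users, a minimal sketch of this LCIO-to-ROOT conversion pattern is given below. It is not the actual Make_Root.cc; the collection name "EcalCalHits", the branch name, and the one-entry-per-hit layout are illustrative assumptions, and the LCIO and ROOT libraries are assumed to be installed.

    // Illustrative sketch of an LCIO -> ROOT conversion (not the actual Make_Root.cc).
    // Collection and branch names are placeholders; adapt them to the collections
    // written by the reconstruction.
    #include "IO/LCReader.h"
    #include "IOIMPL/LCFactory.h"
    #include "EVENT/LCEvent.h"
    #include "EVENT/LCCollection.h"
    #include "EVENT/CalorimeterHit.h"
    #include "TFile.h"
    #include "TTree.h"

    int main(int argc, char** argv) {
        if (argc < 3) return 1;                       // usage: make_root input.slcio output.root

        IO::LCReader* reader = IOIMPL::LCFactory::getInstance()->createLCReader();
        reader->open(argv[1]);

        TFile out(argv[2], "RECREATE");
        TTree tree("events", "flat ntuple from LCIO");
        double hitEnergy = 0.;
        tree.Branch("hitEnergy", &hitEnergy, "hitEnergy/D");

        EVENT::LCEvent* evt = nullptr;
        while ((evt = reader->readNextEvent()) != nullptr) {
            // "EcalCalHits" is a placeholder; getCollection throws if the name is absent.
            EVENT::LCCollection* hits = evt->getCollection("EcalCalHits");
            for (int i = 0; i < hits->getNumberOfElements(); ++i) {
                auto* hit = dynamic_cast<EVENT::CalorimeterHit*>(hits->getElementAt(i));
                hitEnergy = hit->getEnergy();
                tree.Fill();                          // one ntuple entry per calorimeter hit
            }
        }

        tree.Write();
        out.Close();
        reader->close();
        return 0;
    }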

Other setup in the current jsub.xml:
  Number of files processed for a given run: all files if the run has fewer than 15 files; otherwise 15 files (the first 5, 5 from the middle, and the last 5); a sketch of this selection rule is given below
  Output destination: /volatile/hallb/hps/data/run2019/ConcurrentCook/${run}; failed jobs go to /volatile/hallb/hps/data/run2019/ConcurrentCookfail/${run}
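
As a reference for the selection rule above, here is a small, illustrative C++ sketch; pickFiles is a hypothetical helper name, and the exact placement of the "middle" block is an assumption (the jsub.xml scripting may choose it differently).

    #include <string>
    #include <vector>

    // Illustrative file-selection rule: keep every file when a run has fewer than 15,
    // otherwise keep 5 from the start, 5 from the middle, and 5 from the end.
    std::vector<std::string> pickFiles(const std::vector<std::string>& files) {
        const std::size_t n = files.size();
        if (n < 15) return files;                            // small run: keep everything

        std::vector<std::string> selected;
        const std::size_t mid = n / 2 - 2;                   // start of the middle block of 5
        for (std::size_t i = 0; i < 5; ++i) selected.push_back(files[i]);           // first 5
        for (std::size_t i = 0; i < 5; ++i) selected.push_back(files[mid + i]);     // middle 5
        for (std::size_t i = 0; i < 5; ++i) selected.push_back(files[n - 5 + i]);   // last 5
        return selected;
    }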

To request one or more runs for data cooking, please provide the run number(s) and any special requests, such as the number of files to process or a specific list of files for a run, and an output location for long-term storage (the default “volatile” storage keeps output only temporarily).

So far, runs 9569 (SVT OFF) and 9600 (SVT ON) have been cooked.

PS: Please add notes for each run during shifts.

Best regards,
Tongtong
<catagoriesOfRuns2019.xlsx>





