[Hps-analysis] Requests for data cooking
Tongtong Cao
tongtongcao1 at gmail.com
Mon Jul 22 20:30:51 EDT 2019
Hi Rafo,
I am confused.
In your file mkdirs.sh, the sub-folder “logs” is created alongside “recon” and “rootTree”.
So the sub-folder “logs” is created at the beginning of the job for each cooked run.
Indeed, the sub-folder “logs” currently exists for each cooked run, but it is empty; I think that is what confused Norman at first.
Do you mean that a job can still crash even if a directory for the log files is created at the beginning of the job?
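
For reference, my reading of mkdirs.sh is roughly the following (a simplified sketch; the actual variable names and paths in the script may differ):

#!/bin/bash
# Sketch: create the per-run output sub-folders at the start of a cooking job.
# The base path matches the output destination in jsub.xml; everything else
# here is a simplified assumption, not the literal content of mkdirs.sh.
run=$1
base=/volatile/hallb/hps/data/run2019/ConcurrentCook/${run}
for sub in recon rootTree logs; do
    mkdir -p "${base}/${sub}"
done
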
Best regards,
Tongtong
> On Jul 22, 2019, at 7:50 PM, Rafayel Paremuzyan <rafopar at jlab.org> wrote:
>
> Hi Tongtong,
>
>>
>> This bug in the job submission file will be fixed for future runs.
> Please don't change that part of the code; it might break job submission.
>
> Before starting the job, it checks whether the directory for log files exists, and if not, it crashes without even starting the job.
> These directories are created inside the job, when the job starts.
>
> Instead, it might be useful to make another script that moves the logs to the appropriate directories.
> Since the logs have the run and file number in their names, though, it is probably OK to leave things as they are, at least for now.
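>
> For example, such a script might look roughly like this (a rough sketch only; the log-file naming pattern and the per-run directory layout are assumptions, not the actual convention):
>
> #!/bin/bash
> # Sketch: move logs from the common logs area into per-run "logs" sub-folders.
> # Assumes the run number appears as the first 4-digit field in each file name.
> logdir=/volatile/hallb/hps/data/run2019/ConcurrentCook/logs
> outbase=/volatile/hallb/hps/data/run2019/ConcurrentCook
> for f in "${logdir}"/*; do
>     run=$(basename "$f" | grep -oE '[0-9]{4}' | head -n 1)
>     if [ -n "$run" ] && [ -d "${outbase}/${run}" ]; then
>         mkdir -p "${outbase}/${run}/logs"
>         mv "$f" "${outbase}/${run}/logs/"
>     fi
> done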
>
> Rafo
>
>
>
> On 7/22/19 5:59 PM, Tongtong Cao wrote:
>> Hello Norman,
>>
>> Thank you for the feedback.
>> Currently, all log files for all runs are in /volatile/hallb/hps/data/run2019/ConcurrentCook/logs
>> I will let the cook processing finish, and then copy the log files into the sub-folder “logs” for each run.
>>
>> This bug in the job submission file will be fixed for future runs.
>>
>> Best regards,
>> Tongtong
>>
>>> On Jul 22, 2019, at 4:57 PM, Graf, Norman A. <ngraf at slac.stanford.edu> wrote:
>>>
>>> Hello Tongtong,
>>>
>>> I'm seeing output files showing up in recon and rootTree, but nothing in logs. Where are the log files being written?
>>>
>>> Norman
>>>
>>>
>>> From: Hps-analysis <hps-analysis-bounces at jlab.org> on behalf of Tongtong Cao <tongtongcao1 at gmail.com>
>>> Sent: Monday, July 22, 2019 8:21 AM
>>> To: hps-software <hps-software at slac.stanford.edu>; hps-analysis at jlab.org
>>> Subject: Re: [Hps-analysis] Requests for data cooking
>>>
>>> Dear all,
>>>
>>> Yesterday, we got a lot of available runs.
>>> I picked most of the runs from 9908 - 9929 for cooking.
>>> The jobs have been submitted and are running.
>>>
>>> The jar is up to date.
>>> -d: HPS-PhysicsRun2019-v1-4pt5
>>> -x: PhysicsRun2019_testRecon.lcsim
>>> -R: ${run}
>>>
>>> Please let me know if you need extra runs or have special requirements for the setup.
>>>
>>> Best regards,
>>> Tongtong
>>>
>>>> On Jul 17, 2019, at 11:09 AM, Tongtong Cao <tongtongcao1 at gmail.com> wrote:
>>>>
>>>> Dear Colleagues,
>>>>
>>>> According to the run list in Run Spreadsheet 2019, I have categorized the runs collected so far as “Ecal Pedestal”, “SVT OFF”, and “SVT ON”, after dropping junk runs.
>>>> A table is attached.
>>>>
>>>> There are two steps for cooking:
>>>> Step 1: Decoding and reconstruction using the up-to-date “SNAPSHOT-bin.jar”
>>>> Parameter setup in the current jsub.xml:
>>>> 1) -d (detector): HPS-PhysicsRun2019-v1-4pt5
>>>> 2) -x (steering file): PhysicsRun2019_testRecon.lcsim for “SVT ON” runs, and PhysicsRun2019_NoSVT.lcsim for “SVT OFF” runs
>>>> 3) -R (run number): ${run} (run number of processed run)
>>>>
>>>> Step 2: Conversion from the LCIO file into a ROOT file using Make_Root.cc / Make_Root_NoSVT.cc, for the convenience of C++ users
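>>>>
>>>> For illustration, Step 1 amounts to a reconstruction call of roughly the following form for each input file (a hypothetical sketch: the jar path, entry point, and file names are placeholders; only the -d/-x/-R values are taken from the setup above):
>>>>
>>>> #!/bin/bash
>>>> # Sketch of the Step 1 decode + recon call for one input file of run ${run}.
>>>> # The jar name, the EvioToLcio entry point, and the -l output option are
>>>> # assumed here; the detector, steering file, and run number follow the list above.
>>>> java -cp hps-distribution-SNAPSHOT-bin.jar org.hps.evio.EvioToLcio \
>>>>     -x PhysicsRun2019_testRecon.lcsim \
>>>>     -d HPS-PhysicsRun2019-v1-4pt5 \
>>>>     -R ${run} \
>>>>     -l recon_${run}.slcio \
>>>>     input_file.evio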
>>>>
>>>> Other setup in the current jsub.xml:
>>>> Number of processed files for a given run: all files if the run has fewer than 15 files; otherwise 15 files (5 at the beginning, 5 in the middle, and 5 at the end), as sketched below
>>>> Destination of output: /volatile/hallb/hps/data/run2019/ConcurrentCook/${run}; /volatile/hallb/hps/data/run2019/ConcurrentCookfail/${run} if the job failed
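>>>>
>>>> A sketch of that file-selection rule (illustration only; the input-file listing and the ${evio_dir} variable are placeholders, not the actual jsub.xml logic):
>>>>
>>>> #!/bin/bash
>>>> # Sketch: keep all files if a run has fewer than 15, otherwise take 5 from
>>>> # the start, 5 from the middle, and 5 from the end of the sorted file list.
>>>> files=( $(ls "${evio_dir}"/*"${run}"* | sort) )   # placeholder listing
>>>> n=${#files[@]}
>>>> if [ "$n" -lt 15 ]; then
>>>>     selected=( "${files[@]}" )
>>>> else
>>>>     mid=$(( n / 2 - 2 ))
>>>>     selected=( "${files[@]:0:5}" "${files[@]:mid:5}" "${files[@]:n-5:5}" )
>>>> fi
>>>> printf '%s\n' "${selected[@]}"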
>>>>
>>>> To request a run or runs for data cooking, please provide the run number(s) and any special requests, such as the number of files to process (or a list of files) for a run, output storage for long-term use (the default “volatile” storage keeps output only temporarily), etc.
>>>>
>>>> So far, runs 9569 (SVT OFF) and 9600 (SVT ON) have been cooked.
>>>>
>>>> PS: Please add notes for each run during shifts.
>>>>
>>>> Best regards,
>>>> Tongtong
>>>> <catagoriesOfRuns2019.xlsx>
>>
>>
>>