[Halld-offline] [EXTERNAL] auxiliary data file needed for running with HADR=4
Thomas Britton
tbritton at jlab.org
Fri May 15 12:49:32 EDT 2020
Is there a material not found? Medium 11?
Haven't seen that before. Can you modify the MakeMC script to dump your environment right before that block of logging info? Then compare the interactive run to the one run on the farm. My guess is there is some small difference between the two.
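Something as simple as the following, dropped in just before that logging block, would do it (a rough sketch; the file name and exact spot in MakeMC are just illustrative):

  # dump the full environment to a file so the two runs can be compared
  printenv | sort > env_dump.txt

Then diff the dump from the interactive run against the one from the farm job; a variable such as CERNLIB being set in one and not the other should jump right out.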
Thomas Britton
On May 15, 2020, at 12:45 PM, Colin Gleason <gleasonc at jlab.org> wrote:
No crashes; it runs fine without any errors. Logs are located at /volatile/halld/home/gleasonc/pippimeta/Simulation/2017-01_4.19_G3/HADR4_b/log. Looking carefully at one of the log files, I find:
settofg: RF reference plane set to 65.0000000 cm
10 events simulated
20 events simulated
30 events simulated
40 events simulated
50 events simulated
60 events simulated
70 events simulated
80 events simulated
90 events simulated
100 events simulated
200 events simulated
MICAP GTMED: GEANT Medium 11 not found ==> STOP
RUNNING MCSMEAR
0
10000
It looks like this is where things go wrong, as only ~250 of the ~2k generated events are getting processed.
This issue (the STOP) does not appear when I run interactively. An interactive run instead shows:
**** NUMBER OF EVENTS PROCESSED = 10000
**** RANDOM NUMBER GENERATOR AFTER LAST COMPLETE EVENT 2120846575 460615182
**** TIME TO PROCESS ONE EVENT IS = 0.0529 SECONDS
MZEND. Usage statistics for 2 dynamic stores.
Map of store 0 /GCBANK/
------------------------
Division (no., name)   Kind   Mode   Position   Max-size used / allowed   Times Wiped   Garb.-coll. user / auto   Pushd   Redcd
1 QDIV1 0 1 5203 0 4999968 0 0 0 0 0
2 QDIV2 1 1 2267094 1019031 4999968 10000 0 0 0 0
19 system 1 8 2341269 68180 4999968 0 0 0 0 0
20 Constant 1 2 4999968 2635035 4000000 0 0 141 150 0
Map of store 1 /PAWC/
------------------------
Division (no., name)   Kind   Mode   Position   Max-size used / allowed   Times Wiped   Garb.-coll. user / auto   Pushd   Redcd
1 QDIV1 0 1 2 0 4999585 0 0 0 0 0
2 QDIV2 1 1 4998185 1100 4999585 0 0 0 0 0
19 system 1 8 4998585 111 4999585 0 0 0 0 0
20 HIGZ 0 4 4998585 0 5000000 0 0 0 0 0
RZEND. called for RZFILE
***** Normal exit from Hall D GEANT *****
RUNNING MCSMEAR
2
10000
On Fri, May 15, 2020 at 12:34 PM Thomas Britton <tbritton at jlab.org> wrote:
This is tough. From MCwrapper’s POV it does nothing special when that card is in any given state. Do you mean it crashes? We’ll need some stack traces.
Thomas Britton
Staff Scientist, Scientific Computing
Jefferson Lab
From: Colin Gleason <gleasonc at jlab.org>
Sent: Friday, May 15, 2020 12:29 PM
To: Richard Jones <richard.t.jones at uconn.edu>; Thomas Britton <tbritton at jlab.org>
Cc: HallD Software Group <HallD-Offline at jlab.org>
Subject: Re: [EXTERNAL] auxiliary data file needed for running with HADR=4
Hi all,
As a follow-up to this, and to keep a record for others: I cannot get the HADR=4 option to work when submitting jobs to the ifarm via MCwrapper with batch=2. I get reasonable output when I run the job interactively, as seen above. I'm using the same environment, so I'm guessing the correct file is somehow not being found in batch mode; I do not see anything that stands out in the log files. Thomas, any suggestions or ideas on why MCwrapper does not recognize this file? My CERNLIB is located at /group/halld/Software/builds/Linux_CentOS7.7-x86_64-gcc4.8.5/cernlib/2005/lib.
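One check I am considering adding to the job script, right before hdgeant runs, to see whether the file is visible on the farm node (just a csh sketch; the lines below are illustrative):

  # is CERNLIB set in the batch environment?
  printenv CERNLIB
  # if it is, does it actually contain flukaaf.dat?
  ls -l `printenv CERNLIB`/flukaaf.dat
  # is there a local copy in the job's working directory?
  ls -l flukaaf.dat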
-Colin
On Wed, May 13, 2020 at 9:48 AM Colin Gleason <gleasonc at jlab.org> wrote:
Thanks for the information, Richard. Sure enough, I do not have $CERNLIB defined using the standard build. As a workaround to what Sean suggested, and since I am using C shell, I added
setenv CERNLIB /group/halld/Software/builds/Linux_CentOS7.7-x86_64-gcc4.8.5/cernlib/2005/lib
to my setup_gluex.csh file, which gets sourced when I set up my environment. When I tested with 1k generated events, I got 89 reconstructed (about 9%). Before that, i.e. without flukaaf.dat, I was getting 8 (under 1%). This compares to the ~7% of events reconstructed using HADR=1. I will rerun my simulation using this setup.
-Colin
On Wed, May 13, 2020 at 8:53 AM Richard Jones <richard.t.jones at uconn.edu> wrote:
Colin and all,
The data file flukaaf.dat is a required input for running hdgeant with HADR=4. It usually resides in $CERNLIB, so if CERNLIB is defined on your system, things should work. If not, then you should be sure to place a copy of flukaaf.dat in the directory from which you run hdgeant. A copy of the file can be found in the halld_sim repository in the Hall D software area, on cvmfs, or on github at the links provided below. Take your pick; they should all be the same.
* /cvmfs/oasis.opensciencegrid.org/gluex/group/halld/Software/builds/Linux_CentOS7-x86_64-gcc4.8.5-cntr/cernlib/2005/lib/flukaaf.dat
* /u/group/halld/Software/builds/Linux_CentOS7-x86_64-gcc4.8.5/halld_sim/halld_sim-4.13.0/src/programs/Simulation/HDGeant/flukaaf.dat
* https://github.com/JeffersonLab/halld_sim/blob/master/src/programs/Simulation/HDGeant/flukaaf.dat
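For example, either of the following should work before launching hdgeant (a minimal csh sketch; the paths are simply the build areas already quoted in this thread, and yours may differ):

  # option 1: define CERNLIB so hdgeant can find flukaaf.dat there
  setenv CERNLIB /group/halld/Software/builds/Linux_CentOS7.7-x86_64-gcc4.8.5/cernlib/2005/lib

  # option 2: copy flukaaf.dat into the directory from which hdgeant is launched
  cp /u/group/halld/Software/builds/Linux_CentOS7-x86_64-gcc4.8.5/halld_sim/halld_sim-4.13.0/src/programs/Simulation/HDGeant/flukaaf.dat .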
-Richard Jones
--
Colin Gleason
Postdoctoral Fellow
Indiana University
Department of Physics